Science.gov

Sample records for galfa-hi survey techniques

  1. Results from the Arecibo Galactic HI Survey (GALFA-HI)

    NASA Astrophysics Data System (ADS)

    Begum, Ayesha; Ballering, N.; Stanimirovic, S.; Douglas, K.; Gibson, S. J.; Grcevich, J.; Heiles, C.; Korpela, E.; Lee, M.; Peek, J. E. G.; Putman, M. E.

    2009-12-01

    The consortium for Galactic studies with the Arecibo L-band Feed Array (ALFA) is conducting a neutral hydrogen survey of the whole Arecibo sky (declination range -1 to 38 deg), over a velocity range of -700 to +700 km/s, with high angular (3.5 arcmin) and velocity (0.2 km/s) resolution. We present highlights from TOGS (Turn On GALFA Survey), the largest portion of GALFA-HI, which covers thousands of square degrees in commensal drift-scan observations with the ALFALFA and AGES extragalactic ALFA surveys. The unprecedented resolution and sensitivity of our survey have resulted in the detection of numerous isolated, ultra-compact HI clouds at low Galactic velocities, distinctly separated from the HI disk emission. We discuss the properties of this population and its role in the interplay between the Galactic disk and halo.

  2. THE GALFA-HI SURVEY: DATA RELEASE 1

    SciTech Connect

    Peek, J. E. G.; Grcevich, Jana; Putman, M. E.; Saul, Destry; Heiles, Carl; Douglas, Kevin A.; Lee, Min-Young; Stanimirovic, Snezana; Begum, Ayesha; Korpela, Eric J.; Gibson, Steven J.; Robishaw, Timothy; Krco, Marko

    2011-06-01

    We present the Galactic Arecibo L-Band Feed Array H I (GALFA-H I) survey and its first full data release (DR1). GALFA-H I is a high-resolution (~4'), large-area (13,000 deg²), high spectral resolution (0.18 km/s), and wide band (-700 km/s < v_LSR < +700 km/s) survey of the Galactic interstellar medium in the 21 cm hyperfine transition line of neutral hydrogen conducted at Arecibo Observatory. Typical noise levels are 80 mK rms in an integrated 1 km/s channel. GALFA-H I is a dramatic step forward in high-resolution, large-area Galactic H I surveys, and we compare GALFA-H I to past, present, and future Galactic H I surveys. We describe in detail new techniques we have developed to reduce these data in the presence of fixed pattern noise, gain variation, and inconsistent beam shapes, and we show how we have largely mitigated these effects. We present our first full data release, covering 7520 deg² of sky and representing 3046 hr of integration time, and discuss the details of these data.

  3. The GALFA-HI Survey: Probing the Anatomy of Galactic Neutral Hydrogen

    NASA Astrophysics Data System (ADS)

    Pingel, Nickolas

    2011-01-01

    The Galactic HI survey with the Arecibo L-band Feed Array (GALFA-HI) is observing the whole Arecibo sky (about 13,000 square degrees) with high angular (3.5 arcmin) and velocity (0.2 km/s) resolution. This unprecedented angular and velocity resolution allows studies of the Galactic gaseous disk, the halo, and the flow of material between them. The survey operates mainly commensally with other ALFA surveys, saving thousands of hours of telescope time. The 7-beam ALFA feed array is tuned to the hyperfine transition of HI at 1420.405 MHz, and a specially developed method, Least-Squares Frequency Switching (LSFS), is used for bandpass fitting. The survey uses a combination of basket-weave and drift observing modes, and the final data products are RA x Dec x Velocity data cubes. From these cubes we can create detailed images of Galactic HI and of other galaxies. The reduced data are released publicly at http://sites.google.com/site/galfahi/data. We present the current survey and data release status and examples of GALFA-HI images in different visualization modes. This research was partially funded by NSF grant #AST-0707679 and the Research Corporation for Science Advancement. Co-authors: N. Pingel(1), S. Stanimirovic, A. Begum(1), K. A. Douglas(2), S. J. Gibson(3), J. Grcevich(4), C. Heiles(5), M. Lee(1), E. J. Korpela(5), J. Peek(4), M. Putman(4), D. Saul(4); (1) UW-Madison, (2) Univ. of Exeter, United Kingdom, (3) Western Kentucky University, (4) Columbia University, (5) UC-Berkeley.
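    Classical frequency switching, which LSFS generalizes by solving many LO offsets together in a least-squares sense, can be sketched in a few lines (a toy model with invented numbers, not the GALFA-HI pipeline):

```python
import numpy as np

def freq_switch_calibrate(p_sig, p_ref, t_sys):
    """Recover line brightness from signal/reference power spectra.

    The instrumental bandpass g(f) multiplies both spectra, so the
    ratio cancels it: (g*(Tsys+Tline) - g*Tsys) / (g*Tsys) * Tsys = Tline.
    """
    return (p_sig - p_ref) / p_ref * t_sys

chan = np.arange(256)
bandpass = 1.0 + 0.3 * np.sin(chan / 20.0)              # gain ripple g(f)
t_sys = 30.0                                            # system temperature, K
line = 5.0 * np.exp(-0.5 * ((chan - 128) / 6.0) ** 2)   # 5 K Gaussian line

p_sig = bandpass * (t_sys + line)   # LO setting 1: line in band
p_ref = bandpass * t_sys            # LO setting 2: line shifted out of band
recovered = freq_switch_calibrate(p_sig, p_ref, t_sys)
print(round(recovered.max(), 2))    # 5.0 -- the ripple divides out exactly
```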

  4. Column Density Maps of the I-GALFA HI Survey: Evidence for Dark Gas?

    NASA Astrophysics Data System (ADS)

    Gibson, Steven J.; Koo, B.; Douglas, K. A.; Newton, J. H.; Peek, J. E.; Hughes, J. M.; Spraggs, M.; Park, G.; Kang, J.; Heiles, C. E.; Korpela, E. J.

    2014-01-01

    The gas in galactic disks, including our own, occurs in a wide range of temperatures and densities, most of which are unsuitable for star formation. Somehow, diffuse atomic clouds are collected into colder, denser molecular clouds that can collapse under their own gravity. The molecular condensation process is not directly observable, and the gas itself is often "dark" to standard probes like optically thin HI 21cm emission or the CO 2.6mm line. However, the presence of this dark gas can often be inferred from infrared dust emission in excess of what is expected for the observed HI and CO content. We have mapped apparent HI column densities in the Inner-Galaxy Arecibo L-band Feed Array (I-GALFA) survey, which covers a 1600 square degree region at 4-arcminute resolution in the first Galactic quadrant. We compare these "naive" HI columns to others derived from Planck first-release CO and dust maps and NE2001 model dispersion measures to identify a number of areas with potentially significant dark gas. We discuss whether optically thick HI or CO-free H2 is more likely to dominate the dark column, and we consider the effects of possible biases on our results. We acknowledge support from the National Science Foundation, the NASA Kentucky Space Grant Consortium, Western Kentucky University, and the Gatton Academy. I-GALFA (www.naic.edu/igalfa) is a GALFA-HI survey observed with the 7-beam ALFA receiver on the 305-meter William E. Gordon Telescope. The Arecibo Observatory is a U.S. National Science Foundation facility operated under sequential cooperative agreements with Cornell University and SRI International, the latter in alliance with the Ana G. Mendez-Universidad Metropolitana and the Universities Space Research Association.
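    The "naive" column referred to above is the standard optically thin 21 cm relation, N_HI ≈ 1.823×10^18 ∫ T_B dv (N_HI in cm⁻², T_B in K, v in km/s); where the gas is actually optically thick this undercounts the column, which is one way a dark-gas excess arises. A minimal sketch with an invented Gaussian profile:

```python
import numpy as np

C = 1.823e18  # cm^-2 per (K km/s), optically thin 21 cm conversion

def naive_column_density(t_b, dv):
    """Integrate a brightness-temperature spectrum into an HI column."""
    return C * np.sum(t_b) * dv

v = np.arange(-50.0, 50.0, 1.0)              # velocity axis, 1 km/s channels
t_b = 20.0 * np.exp(-0.5 * (v / 10.0) ** 2)  # 20 K Gaussian emission profile
n_hi = naive_column_density(t_b, dv=1.0)
print(f"{n_hi:.2e}")                         # ~9.1e+20 cm^-2
```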

  5. VizieR Online Data Catalog: The GALFA-HI compact cloud catalog (Saul+, 2012)

    NASA Astrophysics Data System (ADS)

    Saul, D. R.; Peek, J. E. G.; Grcevich, J.; Putman, M. E.; Douglas, K. A.; Korpela, E. J.; Stanimirovic, S.; Heiles, C.; Gibson, S. J.; Lee, M.; Begum, A.; Brown, A. R. H.; Burkhart, B.; Hamden, E. T.; Pingel, N. M.; Tonnesen, S.

    2014-07-01

    The catalog is generated using the GALFA-HI Survey DR1. GALFA-HI is a survey of the 1420 MHz hyperfine transition of neutral hydrogen in the Galaxy using the Arecibo 305m telescope and the ALFA seven-beam feed array. The survey is conducted commensally with other Arecibo extragalactic and Galactic surveys (Giovanelli et al. 2005AJ....130.2598G; Guram & Taylor 2009ASPC..407..282G). GALFA-HI data provide a channel spacing of 0.184 km/s and cover a velocity range of ±650 km/s in the local standard of rest (LSR) with a spatial resolution of 4'. The DR1 data cover 7520 deg² of sky in an area between δ=38° and δ=-1° (see the bottom panel of Figure 1, and Figure 2), with a range of sensitivity from 120 mK to 50 mK in 0.74 km/s channels. The details of GALFA-HI observing and data reduction, along with the specifics of the DR1 data set, can be found in Peek et al. (2011ApJS..194...20P). (1 data file).

  6. Tiny, Dusty, Galactic HI Clouds: The GALFA-HI Compact Cloud Catalog

    NASA Astrophysics Data System (ADS)

    Saul, Destry R.; Putman, M. E.; Peek, J. G.

    2013-01-01

    The recently published GALFA-HI Compact Cloud Catalog contains 2000 nearby neutral hydrogen clouds under 20' in angular size, detected with a machine-vision algorithm in the Galactic Arecibo L-Band Feed Array HI survey (GALFA-HI). At a distance of 1 kpc, the compact clouds would typically be 1 solar mass and 1 pc in size. We observe that nearly all of the compact clouds classified as high velocity (>90 km/s) lie near previously identified high-velocity complexes. We separate the compact clouds into populations based on velocity, linewidth, and position. We have begun to search for evidence of dust in these clouds using IRIS and have detections in several populations.

  7. A Sharper View of MBM 53-55 in GALFA HI Emission

    NASA Astrophysics Data System (ADS)

    Gibson, Steven J.; Korpela, E. J.; Stanimirovic, S.; Heiles, C.; Douglas, K. A.; Peek, J. E. G.; Putman, M.

    2007-12-01

    Molecular clouds are enmeshed in webs of diffuse atomic gas that contain important information about their formation and interaction with their environment, not least because the atomic gas can trace lower column densities that are unshielded from UV radiation. The nearby molecular cloud complex MBM 53-55 (Magnani et al. 1985; Yamamoto et al. 2003) is thought to be part of an expanding shell from either stellar winds or a past supernova (Gir et al. 1994). Although this complex subtends some 15 degrees on the sky, its atomic gas has not previously been imaged at sub-parsec scales. The GALFA HI sky survey of Galactic 21cm-line emission with the Arecibo L-band Feed Array has recently mapped the MBM 53-55 region with a 3.5-arcminute beam and 0.2 km/s channels. GALFA's high resolution and sensitivity allow the HI content, environment, and kinematics of these clouds to be explored as never before. The observed HI structure matches IRAS dust emission so well that it is hard to identify IRAS filaments without HI counterparts: the ALFA survey essentially adds a velocity axis to IRAS maps. Most high column density HI features also appear to have CO emission counterparts, but there are also disagreements that may trace the gas phase evolution. We will consider the implications of the GALFA HI data for the origins of the shell-like structure and the formation of the molecular clouds.

  8. GALFA HI: Candidate Sites for H2 Formation in Cold HI Emission and Other Tracers

    NASA Astrophysics Data System (ADS)

    Newton, Jonathan; Gibson, S. J.; Douglas, K. A.; Koo, B.; Kang, J.; Park, G.; Peek, J. E. G.; Korpela, E. J.; Heiles, C.; Dame, T. M.

    2012-01-01

    Interstellar gas has a variety of temperature phases, but only the coldest clouds are dense enough to collapse gravitationally and form stars. How do such clouds form? A key step in this process is the transition from neutral atomic hydrogen (HI) to molecular hydrogen (H2). To identify candidate sites where this HI-to-H2 transition may be underway, we have developed a method of fitting isolated HI 21cm emission features to constrain their spin temperature and other properties vs. position in 21cm-line data cubes. Our method uses the Nelder-Mead 'amoeba' algorithm to solve the relevant radiative transfer equation by identifying the absolute chi-squared minimum in the parameter space. As other investigators have noted, this approach requires a very high signal-to-noise ratio, so we are using sensitive Arecibo L-band Feed Array (ALFA) observations, starting with narrow-line HI emission clouds in the inner-Galaxy ALFA (I-GALFA) survey, and we have also tested the reliability of our method with a large suite of model spectra. Cold HI clouds confirmed by the fit will be compared to tracers of molecular gas, including CO lines and FIR dust emission. The I-GALFA survey is part of the Galactic ALFA HI data set obtained with the Arecibo 305m telescope. Arecibo Observatory is part of the National Astronomy and Ionosphere Center, operated sequentially by Cornell University and Stanford Research Institute under Cooperative Agreement with the U.S. National Science Foundation.
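    A fit of the kind described, minimizing chi-squared for a single-slab radiative-transfer model T_B(v) = T_s(1 - e^(-tau(v))) with a Nelder-Mead simplex, might look like the sketch below (a hypothetical noiseless toy using SciPy's implementation, not the authors' code):

```python
import numpy as np
from scipy.optimize import minimize

def model(p, v):
    """Single-slab radiative transfer with a Gaussian opacity profile."""
    t_s, tau0, v0, sigma = p
    tau = tau0 * np.exp(-0.5 * ((v - v0) / sigma) ** 2)
    return t_s * (1.0 - np.exp(-tau))

def chi2(p, v, data):
    return np.sum((data - model(p, v)) ** 2)

v = np.linspace(-20.0, 20.0, 200)
truth = (50.0, 1.0, 0.0, 3.0)        # T_s = 50 K, tau0 = 1, v0 = 0, 3 km/s
data = model(truth, v)               # noiseless, per the S/N caveat above

fit = minimize(chi2, x0=(40.0, 0.8, 0.5, 2.5), args=(v, data),
               method="Nelder-Mead",
               options={"maxiter": 10000, "maxfev": 10000,
                        "xatol": 1e-10, "fatol": 1e-12})
print(np.round(fit.x, 3))            # recovers (T_s, tau0, v0, sigma) closely
```

    With real, noisy spectra the T_s-tau0 degeneracy of optically thin lines makes this far harder, which is why the abstract stresses the need for very high signal-to-noise data.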

  9. GALFA-HI: A Targeted Search For Star Formation on the Far Side of the Milky Way

    NASA Astrophysics Data System (ADS)

    Stantzos, Nicholas; Gostisha, M.; Benjamin, R.; Gibson, S.; Koo, B.; Douglas, K. A.; Kang, J.; Park, G.; Peek, J. E. G.; Korpela, E. J.; Heiles, C.; Newton, J. H.

    2012-01-01

    The I-GALFA Survey provides a unique window on the spiral structure of the Milky Way, as it contains three coherent 21 cm features that have been identified as spiral arms: the Perseus Arm, the Outer Arm, and the recently discovered Outer Scutum-Centaurus Arm. Moreover, all three of these arms lie beyond the solar circle (although the Perseus Arm is thought to cross interior to the solar circle for l < 50 degrees), so this gas does not suffer the kinematic distance ambiguity encountered in the inner Galaxy. We use these data and the CO surveys compiled by Dame et al. (2001) to target a search for distant star formation regions seen in the Spitzer Space Telescope/GLIMPSE and WISE mid-infrared all-sky surveys. We characterize the HI arms and present the star formation regions potentially associated with these three arms. Many of these objects will need spectroscopic follow-up, but some have been previously identified in the Green Bank Telescope HII Region Discovery Survey of Anderson et al. (2011). The Inner Galaxy ALFA (I-GALFA) survey is part of the Galactic ALFA HI data set obtained with the Arecibo L-band Feed Array (ALFA) on the Arecibo 305m telescope. Arecibo Observatory is part of the National Astronomy and Ionosphere Center, operated sequentially by Cornell University and Stanford Research Institute under Cooperative Agreement with the U.S. National Science Foundation.

  10. Laryngoscope decontamination techniques: A survey

    PubMed Central

    Chawla, Rajiv; Gupta, Akhilesh; Gupta, Anshu; Kumar, Mritunjay

    2016-01-01

    Background and Aims: India is a vast country with variable, nonuniform healthcare practices. A laryngoscope is an important tool during general anesthesia and resuscitation. The study aimed to determine the current practices of laryngoscope decontamination in India. Material and Methods: An online survey was conducted amongst 100 anesthesiologists to determine the common methods of laryngoscope decontamination adopted in their settings. The survey was done over 6 months after validating the questionnaire. Results: A total of 73 responses were received out of 100. The result of the survey revealed that there is no uniform technique of laryngoscope decontamination. There is marked variability in techniques followed not only among different institutions, but also within the same institution. Conclusion: There are no fixed protocols adopted for laryngoscope decontamination. Thus, there is a need to develop definitive guidelines on this subject, which can be implemented in India. PMID:27006551

  11. Compact Neutral Hydrogen Clouds: Searching for Undiscovered Dwarf Galaxies and Gas Associated with an Algol-type Variable Star

    NASA Astrophysics Data System (ADS)

    Grcevich, Jana; Berger, Sabrina; Putman, Mary E.; Peek, Joshua Eli Goldston

    2016-01-01

    Several interesting compact neutral hydrogen clouds were found in the GALFA-HI (Galactic Arecibo L-Band Feed Array HI) survey which may represent undiscovered dwarf galaxy candidates. The continuation of this search is motivated by successful discoveries of Local Volume dwarfs in the GALFA-HI DR1. We identify additional potential dwarf galaxies from the GALFA-HI DR1 Compact Cloud Catalog which are identified as having unexpected velocities given their other characteristics via the Bayesian analysis software BayesDB. We also present preliminary results of a by-eye search for dwarf galaxies in the GALFA-HI DR2, which provides additional sky coverage. Interestingly, one particularly compact cloud discovered during our dwarf galaxy search is spatially coincident with an Algol-type variable star. Although the association is tentative, Algol-type variables are thought to have undergone significant gas loss, and it is possible this gas may be observable in HI.

  12. A Survey of Shape Parameterization Techniques

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1999-01-01

    This paper provides a survey of shape parameterization techniques for multidisciplinary optimization and highlights some emerging ideas. The survey focuses on the suitability of available techniques for complex configurations, with suitability criteria based on the efficiency, effectiveness, ease of implementation, and availability of analytical sensitivities for geometry and grids. The paper also contains a section on field grid regeneration, grid deformation, and sensitivity analysis techniques.

  13. Survey of Biochemical Separation Techniques

    ERIC Educational Resources Information Center

    Nilsson, Melanie R.

    2007-01-01

    A simple laboratory exercise is illustrated that exposes students to a wide range of separation techniques in one laboratory program and provides a nice complement to a project-oriented program. Students have learned the basic principles of syringe filtration, centricon, dialysis, gel filtration and solid-phase extraction methodologies and have got…

  14. Survey of Header Compression Techniques

    NASA Technical Reports Server (NTRS)

    Ishac, Joseph

    2001-01-01

    This report provides a summary of several different header compression techniques. The different techniques included are: (1) Van Jacobson's header compression (RFC 1144); (2) SCPS (Space Communications Protocol Standards) header compression (SCPS-TP, SCPS-NP); (3) Robust header compression (ROHC); and (4) the header compression techniques in RFC 2507 and RFC 2508. The methodology for compression and error correction for these schemes is described in the remainder of this document. All of the header compression schemes support compression over simplex links, provided that the end receiver has some means of sending data back to the sender. However, if that return path does not exist, then neither Van Jacobson's nor SCPS can be used, since both rely on TCP (Transmission Control Protocol). In addition, under link conditions of low delay and low error, all of the schemes perform as expected. However, based on the methodology of the schemes, each scheme is likely to behave differently as conditions degrade. Van Jacobson's header compression relies heavily on the TCP retransmission timer and would suffer an increase in loss propagation should the link possess a high delay and/or bit error rate (BER). The SCPS header compression scheme protects against high-delay environments by avoiding delta encoding between packets. Thus, loss propagation is avoided. However, SCPS is still affected by an increased BER, since the lack of delta encoding results in larger header sizes. Next, the schemes found in RFC 2507 and RFC 2508 perform well for non-TCP connections in poor conditions. RFC 2507 performance with TCP connections is improved by various techniques over Van Jacobson's, but still suffers a performance hit with poor link properties. Also, RFC 2507 offers the ability to send TCP data without delta encoding, similar to what SCPS offers. ROHC is similar to the previous two schemes, but adds additional CRCs (cyclic redundancy checks) into headers and improves
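    The delta-encoding idea at the heart of Van Jacobson's scheme, and the source of its loss propagation, can be sketched with a toy field set (not RFC 1144's actual bit-level encoding):

```python
def compress(prev, cur):
    """Send only the fields that changed, as differences."""
    return {k: cur[k] - prev[k] for k in cur if cur[k] != prev[k]}

def decompress(prev, delta):
    """Rebuild the full header from the previous one plus deltas."""
    hdr = dict(prev)
    for k, d in delta.items():
        hdr[k] += d
    return hdr

h1 = {"seq": 1000, "ack": 500, "win": 8192}
h2 = {"seq": 1040, "ack": 500, "win": 8192}   # only seq advanced
delta = compress(h1, h2)
print(delta)                                  # {'seq': 40}
assert decompress(h1, delta) == h2            # lossless given correct context
```

    If the receiver's copy of the previous header is lost, every later delta decodes against the wrong state until a full header resynchronizes both ends, which is why the schemes surveyed above differ mainly in how they bound that propagation.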

  15. Survey of data compression techniques

    SciTech Connect

    Gryder, R.; Hake, K.

    1991-09-01

    PM-AIM must provide to customers in a timely fashion information about Army acquisitions. This paper discusses ways that PM-AIM can reduce the volume of data that must be transmitted between sites. Although this paper primarily discusses techniques of data compression, it also briefly discusses other options for meeting the PM-AIM requirements. The options available to PM-AIM, in addition to hardware and software data compression, include less-frequent updates, distribution of partial updates, distributed data base design, and intelligent network design. Any option that enhances the performance of the PM-AIM network is worthy of consideration. The recommendations of this paper apply to the PM-AIM project in three phases: the current phase, the target phase, and the objective phase. Each recommendation will be identified as (1) appropriate for the current phase, (2) considered for implementation during the target phase, or (3) a feature that should be part of the objective phase of PM-AIM's design. The current phase includes only those measures that can be taken with the installed leased lines. The target phase includes those measures that can be taken in transferring the traffic from the leased lines to the DSNET environment with minimal changes in the current design. The objective phase includes all the things that should be done as a matter of course. The objective phase for PM-AIM appears to be a distributed data base with data for each site stored locally and all sites having access to all data.
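    As a rough illustration of the software-compression option (using zlib as a stand-in, since the paper does not commit to a codec), repetitive status records of the kind an acquisition-reporting system transmits compress many-fold:

```python
import zlib

# 500 near-identical records: the redundancy LZ77-style compression exploits.
records = b"".join(b"site=%03d status=OK qty=100\n" % i for i in range(500))
packed = zlib.compress(records, level=9)
print(len(records), len(packed))            # compressed size is a small fraction
assert zlib.decompress(packed) == records   # lossless round trip
```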

  17. Monitoring beach changes using GPS surveying techniques

    USGS Publications Warehouse

    Morton, Robert; Leach, Mark P.; Paine, Jeffrey G.; Cardoza, Michael A.

    1993-01-01

    The adaptation of Global Positioning System (GPS) surveying techniques to beach monitoring is a promising response to the challenge of measuring beach change accurately and efficiently. An experiment that employed both GPS and conventional beach surveying was conducted, and a new beach monitoring method employing kinematic GPS surveys was devised. This new method involves the collection of precise shore-parallel and shore-normal GPS positions from a moving vehicle so that an accurate two-dimensional beach surface can be generated. Results show that the GPS measurements agree with conventional shore-normal surveys at the 1 cm level, and repeated GPS measurements employing the moving vehicle demonstrate a precision of better than 1 cm. In addition, the nearly continuous sampling and increased resolution provided by the GPS surveying technique reveal alongshore changes in beach morphology that are undetected by conventional shore-normal profiles. The application of GPS surveying techniques, combined with the refinement of appropriate methods for data collection and analysis, provides a better understanding of beach changes, sediment transport, and storm impacts.
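    The two-dimensional surface generation described can be sketched by binning scattered GPS fixes onto a grid and differencing repeat surveys (synthetic coordinates and an assumed uniform 30 cm erosion; the actual method is considerably more refined):

```python
import numpy as np

def grid_surface(x, y, z, xbins, ybins):
    """Mean elevation per grid cell from scattered (x, y, z) fixes."""
    sums, _, _ = np.histogram2d(x, y, bins=[xbins, ybins], weights=z)
    counts, _, _ = np.histogram2d(x, y, bins=[xbins, ybins])
    return sums / np.maximum(counts, 1)

rng = np.random.default_rng(0)
x = rng.uniform(0, 100, 2000)    # alongshore position, m
y = rng.uniform(0, 50, 2000)     # cross-shore position, m
z1 = 2.0 - 0.02 * y              # survey 1: planar beach, 2% slope
z2 = z1 - 0.3                    # survey 2: 30 cm of uniform erosion

bins = (np.linspace(0, 100, 11), np.linspace(0, 50, 6))
change = grid_surface(x, y, z2, *bins) - grid_surface(x, y, z1, *bins)
print(round(float(change.mean()), 2))   # -0.3, the imposed erosion
```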

  18. A Survey of Techniques for Approximate Computing

    DOE PAGES

    Mittal, Sparsh

    2016-03-18

    Approximate computing trades off computation quality against the effort expended, and as rising performance demands confront plateauing resource budgets, approximate computing has become not merely attractive but imperative. Here, we present a survey of techniques for approximate computing (AC). We discuss strategies for finding approximable program portions and monitoring output quality; techniques for using AC in different processing units (e.g., CPU, GPU, and FPGA), processor components, and memory technologies; and programming frameworks for AC. Moreover, we classify these techniques based on several key characteristics to emphasize their similarities and differences. Finally, the aim of this paper is to provide researchers with insights into the working of AC techniques and to inspire more efforts in this area to make AC the mainstream computing approach in future systems.
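    Loop perforation is a classic example of the "approximable program portion" techniques such surveys cover: skip a fraction of iterations and accept a bounded error in exchange for less work (an illustrative sketch, not an example taken from the survey itself):

```python
def mean_exact(xs):
    return sum(xs) / len(xs)

def mean_perforated(xs, stride=4):
    """Visit only every stride-th element: 1/stride of the work."""
    sampled = xs[::stride]
    return sum(sampled) / len(sampled)

xs = [float(i % 100) for i in range(10_000)]
exact = mean_exact(xs)                     # 49.5
approx = mean_perforated(xs, stride=4)     # 48.0: ~3% error for 4x less work
print(exact, approx)
```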

  19. Recommendations for abortion surveys using the ballot-box technique.

    PubMed

    Medeiros, Marcelo; Diniz, Debora

    2012-07-01

    The article lists recommendations for dealing with methodological aspects of an abortion survey and makes suggestions for testing and validating the survey questionnaire. The recommendations are based on the experience of the Brazilian Abortion Survey (PNA), a random sample household survey that used the ballot-box technique and covered adult women in all urban areas of the country. PMID:22872333

  20. Survey of Radiographic Requirements and Techniques.

    ERIC Educational Resources Information Center

    Farman, Allan G.; Shawkat, Abdul H.

    1981-01-01

    A survey of dental schools revealed little standardization of student requirements for dental radiography in the United States. There was a high degree of variability as to what constituted a full radiographic survey, which has implications concerning the maximum limits to patient exposure to radiation. (Author/MLW)

  1. A survey of techniques for corrosion monitoring

    SciTech Connect

    Mickalonis, J.I.

    1992-10-01

    Corrosion monitoring techniques have improved with advances in instrumentation technology and corrosion research. Older techniques, such as coupon immersion, generally provide historical information. The new electrochemical techniques, which have not been used widely at SRS, allow on-line monitoring and correlation with process changes. These techniques could improve the corrosion assessment of the waste tanks to be used for In-Tank Precipitation and Extended Sludge Processing. A task was initiated to place an electrochemical probe into tank 48 for testing the utility of this technique for waste tank applications.

  2. A Catalog of Ultra-compact High Velocity Clouds from the ALFALFA Survey: Local Group Galaxy Candidates?

    NASA Astrophysics Data System (ADS)

    Adams, Elizabeth A. K.; Giovanelli, Riccardo; Haynes, Martha P.

    2013-05-01

    We present a catalog of 59 ultra-compact high velocity clouds (UCHVCs) extracted from the 40% complete ALFALFA HI-line survey. The ALFALFA UCHVCs have median flux densities of 1.34 Jy km s-1, median angular diameters of 10', and median velocity widths of 23 km s-1. We show that the full UCHVC population cannot easily be associated with known populations of high velocity clouds. Of the 59 clouds presented here, only 11 are also present in the compact cloud catalog extracted from the commensal GALFA-HI survey, demonstrating the utility of this separate dataset and analysis. Based on their sky distribution and observed properties, we infer that the ALFALFA UCHVCs are consistent with the hypothesis that they may be very low mass galaxies within the Local Volume. In that case, most of their baryons would be in the form of gas, and because of their low stellar content, they remain unidentified by extant optical surveys. At distances of ~1 Mpc, the UCHVCs have neutral hydrogen (H I) masses of ~105-106 M ⊙, H I diameters of ~2-3 kpc, and indicative dynamical masses within the H I extent of ~107-108 M ⊙, similar to the Local Group ultra-faint dwarf Leo T. The recent ALFALFA discovery of the star-forming, metal-poor, low mass galaxy Leo P demonstrates that this hypothesis is true in at least one case. In the case of the individual UCHVCs presented here, confirmation of their extragalactic nature will require further work, such as the identification of an optical counterpart to constrain their distance.
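    The quoted masses follow from the standard single-dish relation M_HI = 2.36×10^5 d² S solar masses, with d in Mpc and S the integrated flux in Jy km/s. A quick check with the median catalog flux at the assumed ~1 Mpc distance:

```python
def hi_mass_msun(flux_jy_kms, dist_mpc):
    """Standard optically thin single-dish HI mass estimate."""
    return 2.36e5 * dist_mpc ** 2 * flux_jy_kms

m = hi_mass_msun(1.34, 1.0)     # median UCHVC flux at 1 Mpc
print(f"{m:.2e}")               # 3.16e+05 solar masses, in the quoted range
```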

  4. Survey of immunoassay techniques for biological analysis

    SciTech Connect

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies which further improves the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize the enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs.

  5. Survey of Software Assurance Techniques for Highly Reliable Systems

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy

    2004-01-01

    This document provides a survey of software assurance techniques for highly reliable systems including a discussion of relevant safety standards for various industries in the United States and Europe, as well as examples of methods used during software development projects. It contains one section for each industry surveyed: Aerospace, Defense, Nuclear Power, Medical Devices and Transportation. Each section provides an overview of applicable standards and examples of a mission or software development project, software assurance techniques used and reliability achieved.

  6. Survey of air cargo forecasting techniques

    NASA Technical Reports Server (NTRS)

    Kuhlthan, A. R.; Vermuri, R. S.

    1978-01-01

    Forecasting techniques currently used to estimate or predict the demand for air cargo in various markets are discussed, with emphasis on the fundamentals of the different forecasting approaches. References to specific studies are cited where appropriate. The effectiveness of current methods is evaluated and several prospects for future activities or approaches are suggested. Appendices contain summary-type analyses of about 50 specific publications on forecasting, and selected bibliographies on air cargo forecasting, air passenger demand forecasting, and general demand and modal-split modeling.

  7. Survey Of High Speed Test Techniques

    NASA Astrophysics Data System (ADS)

    Gheewala, Tushar

    1988-02-01

    The emerging technologies for the characterization and production testing of high-speed devices and integrated circuits are reviewed. Continuing progress in semiconductor technologies will, in the near future, demand techniques to test 10 ps to 100 ps gate delays, 10 GHz to 100 GHz analog functions, and 10,000 to 100,000 gates on a single chip. Clearly, no single test technique can provide a cost-effective answer to all of these demands. A divide-and-conquer approach based on a judicious selection of parametric, functional, and high-speed tests will be required. In addition, design-for-test methods need to be pursued, including on-chip test electronics as well as circuit techniques that minimize the sensitivity of circuit performance to allowable process variations. Electron- and laser-beam-based test technologies look very promising and may provide much-needed solutions not only to the high-speed test problem but also to the need for high levels of fault coverage during functional testing.

  8. Superresolution imaging: a survey of current techniques

    NASA Astrophysics Data System (ADS)

    Cristóbal, G.; Gil, E.; Šroubek, F.; Flusser, J.; Miravet, C.; Rodríguez, F. B.

    2008-08-01

    Imaging plays a key role in many diverse areas of application, such as astronomy, remote sensing, microscopy, and tomography. Owing to imperfections of measuring devices (e.g., optical degradations, limited size of sensors) and instability of the observed scene (e.g., object motion, media turbulence), acquired images can be indistinct, noisy, and may exhibit insufficient spatial and temporal resolution. In particular, several external effects blur images. Techniques for recovering the original image include blind deconvolution (to remove blur) and superresolution (SR). The stability of these methods depends on having more than one image of the same frame. Differences between images are necessary to provide new information, but they can be almost imperceptible. State-of-the-art SR techniques achieve remarkable results in resolution enhancement by estimating the subpixel shifts between images, but they lack any apparatus for calculating the blurs. In this paper, after a review of current SR techniques, we describe two SR methods recently developed by the authors. First, we introduce a variational method that minimizes a regularized energy function with respect to the high-resolution image and blurs. In this way we establish a unified framework for simultaneously estimating the blurs and the high-resolution image. By estimating blurs we automatically estimate shifts with subpixel accuracy, which is essential for good SR performance. Second, an innovative learning-based algorithm using a neural architecture for SR is described. Comparative experiments on real data illustrate the robustness and utility of both methods.
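    The subpixel registration step that the abstract calls out can be illustrated with a toy 1-D sketch (function name and the parabolic-refinement choice are mine, not the paper's method): locate the cross-correlation peak between two frames, then refine it with a parabola through the three samples around the peak.

```python
import math

def subpixel_shift(a, b):
    """Estimate how far signal b is shifted relative to a: find the circular
    cross-correlation peak, then refine with a parabolic fit through the
    three samples around it."""
    n = len(a)
    # brute-force circular cross-correlation (fine for a short demo signal)
    cc = [sum(a[i] * b[(i + s) % n] for i in range(n)) for s in range(n)]
    p = max(range(n), key=cc.__getitem__)
    # parabolic interpolation around the integer peak -> subpixel estimate
    y0, y1, y2 = cc[(p - 1) % n], cc[p], cc[(p + 1) % n]
    denom = y0 - 2.0 * y1 + y2
    frac = 0.5 * (y0 - y2) / denom if denom else 0.0
    shift = p + frac
    return shift - n if shift > n / 2 else shift

# Two Gaussian "frames", the second shifted by 3 samples:
a = [math.exp(-((i - 10) / 2.0) ** 2) for i in range(32)]
b = [math.exp(-((i - 13) / 2.0) ** 2) for i in range(32)]
```

    Real SR pipelines do this in 2-D and in the Fourier domain, but the idea is the same: the correlation peak's fractional position carries the subpixel offset.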

  9. A survey of data mining techniques

    NASA Astrophysics Data System (ADS)

    Jorgensen, A. M.; Karimabadi, H.

    2005-12-01

    Data mining is the act of extracting useful knowledge from a data set. This knowledge can take many forms. It can be in the detection of features of interest. It can be in the form of statistical analysis of data sets. It can be in the form of predictive expressions describing the data or relationships between data. It can be in the form of anomaly detection: serendipitous events that require further investigation. The simplest form of data mining consists of manual inspection of the data sets. This has been the norm in space physics for decades. The most advanced forms of data mining make use of emerging computer science methods, including advances in artificial intelligence. In this presentation we will give a brief introduction to some of the resources that are available for performing advanced data mining. We will focus on giving an overview of some data mining techniques, and how they have been, or could be, applied to space physics problems.

  10. A Survey of Architectural Techniques For Improving Cache Power Efficiency

    SciTech Connect

    Mittal, Sparsh

    2013-01-01

    Modern processors use increasingly large on-chip caches. Also, with each CMOS technology generation, there has been a significant increase in their leakage energy consumption. For this reason, cache power management has become a crucial research issue in modern processor design. To address this challenge and meet the goals of sustainable computing, researchers have proposed several techniques for improving the energy efficiency of cache architectures. This paper surveys recent architectural techniques for improving cache power efficiency and presents a classification of these techniques based on their characteristics. To provide an application perspective, it also reviews several real-world processor chips that employ cache energy saving techniques. The aim of this survey is to enable engineers and researchers to gain insight into techniques for improving cache power efficiency and to motivate them to invent novel solutions for enabling low-power operation of caches.

  11. Pattern recognition techniques in microarray data analysis: a survey.

    PubMed

    Valafar, Faramarz

    2002-12-01

    Recent development of technologies (e.g., microarray technology) that are capable of producing massive amounts of genetic data has highlighted the need for new pattern recognition techniques that can mine and discover biologically meaningful knowledge in large data sets. Many researchers have begun an endeavor in this direction to devise such data-mining techniques. As such, there is a need for survey articles that periodically review and summarize the work that has been done in the area. This article presents one such survey. The first portion of the paper is meant to provide the basic biology (mostly for non-biologists) that is required in such a project. This part is only meant to be a starting point for those experts in the technical fields who wish to embark on this new area of bioinformatics. The second portion of the paper is a survey of various data-mining techniques that have been used in mining microarray data for biological knowledge and information (such as sequence information). This survey is not meant to be treated as complete in any form, since the area is currently one of the most active, and the body of research is very large. Furthermore, the applications of the techniques mentioned here are not meant to be taken as the most significant applications of the techniques, but simply as examples among many. PMID:12594081

  12. Nondestructive Technique Survey for Assessing Integrity of Composite Firing Vessel

    SciTech Connect

    Tran, A.

    2000-08-01

    The repeated use and limited lifetime of a composite firing vessel compel a need to survey techniques for monitoring the structural integrity of the vessel in order to determine when it should be retired. Various nondestructive techniques were researched and evaluated based on their applicability to the vessel. The methods were visual inspection, liquid penetrant testing, magnetic particle testing, surface-mounted strain gauges, thermal inspection, acoustic emission, ultrasonic testing, radiography, eddy current testing, and embedded fiber optic sensors. It was determined that embedded fiber optic sensors are the most promising technique due to their ability to be embedded within layers of composites and their immunity to electromagnetic interference.

  13. A survey of visual preprocessing and shape representation techniques

    NASA Technical Reports Server (NTRS)

    Olshausen, Bruno A.

    1988-01-01

    Many recent theories and methods proposed for visual preprocessing and shape representation are summarized. The survey brings together research from the fields of biology, psychology, computer science, electrical engineering, and most recently, neural networks. It was motivated by the need to preprocess images for a sparse distributed memory (SDM), but the techniques presented may also prove useful for applying other associative memories to visual pattern recognition. The material of this survey is divided into three sections: an overview of biological visual processing; methods of preprocessing (extracting parts of shape, texture, motion, and depth); and shape representation and recognition (form invariance, primitives and structural descriptions, and theories of attention).

  14. Reef Fish Survey Techniques: Assessing the Potential for Standardizing Methodologies.

    PubMed

    Caldwell, Zachary R; Zgliczynski, Brian J; Williams, Gareth J; Sandin, Stuart A

    2016-01-01

    Dramatic changes in populations of fishes living on coral reefs have been documented globally and, in response, the research community has initiated efforts to assess and monitor reef fish assemblages. A variety of visual census techniques are employed; however, results are often incomparable due to differential methodological performance. Although comparability of data may promote improved assessment of fish populations, and thus management of often critically important nearshore fisheries, to date no standardized and agreed-upon survey method has emerged. This study describes the use of methods across the research community and identifies potential drivers of method selection. An online survey was distributed to researchers from academic, governmental, and non-governmental organizations internationally. Although many methods were identified, 89% of survey-based projects employed one of three methods: belt transect, stationary point count, and some variation of the timed swim method. The selection of survey method was independent of the research design (i.e., assessment goal) and region of study, but was related to the researcher's home institution. While some researchers expressed willingness to modify their current survey protocols to more standardized protocols (76%), their willingness decreased when methodologies were tied to long-term datasets spanning five or more years. Willingness to modify current methodologies was also less common among academic researchers than resource managers. By understanding both the current application of methods and the reported motivations for method selection, we hope to focus discussions towards increasing the comparability of quantitative reef fish survey data. PMID:27111085


  16. Integrated Surveying Techniques for Sensitive Areas: San Felice Sul Panaro

    NASA Astrophysics Data System (ADS)

    Ballarin, M.; Buttolo, V.; Guerra, F.; Vernier, P.

    2013-07-01

    The last few years have seen exponential growth in the use of electronic and computing technologies, opening new possibilities and new scenarios in the Geomatics field. This evolution of tools and methods has led to new ways of approaching survey work. In architecture, the new tools for survey acquisition and 3D modelling allow the representation of an object through a digital model, combining the visual potential of images, normally used for documentation, with the precision of a metric survey. This research focuses on the application of these new technologies and methodologies to sensitive areas, such as portions of cities affected by earthquakes. In this field the survey is intended to provide useful support for structural analysis, in conservation as well as in restoration studies. However, survey in architecture is still a very complex operation, both methodologically and practically: it requires a critical interpretation of the artefacts and a deep knowledge of the existing techniques and technologies, which often are very different but need to be integrated within a single general framework. This paper describes the first results of the survey conducted on the church of San Geminiano in San Felice sul Panaro (Modena). Different tools and methods were used in order to create a new system that integrates the most recent and cutting-edge technologies in the Geomatics field. The methodologies used were laser scanning, UAV photogrammetry, and topography for the definition of the reference system. The present work focuses on data acquisition and processing with these techniques and on their integration.

  17. The Importance of Local Surveys for Tying Techniques Together

    NASA Technical Reports Server (NTRS)

    Long, James L.; Bosworth, John M.

    2000-01-01

    The synergistic benefits of combining observations from multiple space geodesy techniques located at a site are a main reason behind the proposal for the establishment of the International Space Geodetic and Gravimetric Network (ISGN). However, the full benefits of inter-comparison are only realized when the spatial relationships between the different space geodetic systems are accurately determined. These spatial relationships are best determined and documented by developing a local reference network of stable ground monuments and conducting periodic surveys to tie together the reference points of the space geodetic systems (for example, the intersection of the rotation axes of a VLBI antenna) and the ground monument network. The data obtained from local surveys are vital to understanding any systematic errors within an individual technique and to identifying any local movement or deformation of the space geodetic systems over time.

  18. Surveying co-located space geodesy techniques for ITRF computation

    NASA Astrophysics Data System (ADS)

    Sarti, P.; Sillard, P.; Vittuari, L.

    2003-04-01

    We present a comprehensive operational methodology, based on classical geodetic triangulation and trilateration, that allows the determination of the reference points of the five space geodesy techniques used in ITRF computation (i.e., DORIS, GPS, LLR, SLR, VLBI). Most of the time, for a single technique, the reference point is not accessible and cannot be measured directly. Likewise, no mechanically determined eccentricity with respect to an external, measurable point is usually given. In these cases, it is not possible to measure the sought reference points directly, and it is even less straightforward to obtain the statistical information relating these points across techniques. We outline the most general practical surveying methodology for recovering the reference points of the different techniques regardless of their physical materialization. We also give a detailed analytical approach for less straightforward cases (e.g., non-geodetic VLBI antennae and SLR/LLR systems). We stress the importance of surveying instrumentation and procedure in achieving the best possible results and outline the impact of the information retrieved with our method on ITRF computation. In particular, we give numerical examples of the computation of the reference point of VLBI antennae (Ny Aalesund and Medicina) and of the eccentricity vector linking the co-located VLBI and GPS techniques in Medicina (Italy). Special attention was paid to the rigorous derivation of statistical elements, which will be covered in a separate presentation.
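    One concrete ingredient of such indirect surveys is worth sketching: a target fixed to an antenna rotated about one axis traces a circle whose centre lies on that axis, so least-squares circle fitting recovers an otherwise inaccessible reference point. Below is a minimal Kasa-style fit in 2-D (my own illustration, not the authors' adjustment code): it solves the linear model x^2 + y^2 = 2ax + 2by + c for the centre (a, b).

```python
import math

def fit_circle(points):
    """Kasa least-squares circle fit: solve x^2 + y^2 = 2*a*x + 2*b*y + c
    for the centre (a, b); the radius is sqrt(c + a^2 + b^2)."""
    n = float(len(points))
    Sxx = Sxy = Syy = Sx = Sy = 0.0
    Sxz = Syz = Sz = 0.0
    for x, y in points:
        z = x * x + y * y
        Sxx += 4 * x * x; Sxy += 4 * x * y; Syy += 4 * y * y
        Sx += 2 * x;      Sy += 2 * y
        Sxz += 2 * x * z; Syz += 2 * y * z; Sz += z
    # normal equations (A^T A) u = A^T z for u = (a, b, c),
    # with design rows (2x, 2y, 1) and targets z = x^2 + y^2
    M = [[Sxx, Sxy, Sx], [Sxy, Syy, Sy], [Sx, Sy, n]]
    v = [Sxz, Syz, Sz]
    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    D = det3(M)
    # Cramer's rule: replace column i of M by v
    a, b, c = (det3([[v[r] if j == i else M[r][j] for j in range(3)]
                     for r in range(3)]) / D for i in range(3))
    return (a, b), math.sqrt(c + a * a + b * b)

# Hypothetical marker positions observed while rotating the dish in azimuth
# (true centre (1, 2), true radius 3):
pts = [(1 + 3 * math.cos(t), 2 + 3 * math.sin(t)) for t in (0.1, 0.9, 2.0, 3.5, 5.0)]
centre, radius = fit_circle(pts)
```

    The real procedure works in 3-D with two rotation axes and a rigorous covariance propagation, but circle fitting of this kind is the geometric core.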

  19. Time Delay Integration: A Wide-Field Survey Technique

    NASA Astrophysics Data System (ADS)

    Lapointe, Robert; Hill, E.; Leimer, L.; McMillian, K.; Miller, A.; Prindle, A.

    2009-05-01

    The Advanced Placement Physics class of Orange Lutheran High School has conducted a survey-imaging project using a Time Delay Integration (TDI) technique. TDI enables very wide-field images to be collected in the form of long strips of the sky. Images from a series of five consecutive nights were captured, calibrated, and compared to reveal possible transient phenomena such as supernovae, asteroids, and other events that show a noticeable change over 24-hour intervals.
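    The charge-clocking at the heart of TDI can be shown with a toy one-dimensional model (my own sketch, not the class's pipeline): the accumulated charge is shifted across the detector at the sky's drift rate, so each sky pixel is integrated once per detector row before read-out.

```python
def tdi_scan(sky, n_rows):
    """Toy 1-D TDI drift scan: the sky moves one detector row per clock
    tick and the charge is shifted in lock-step, so each charge packet
    keeps integrating the same sky pixel for n_rows ticks before read-out."""
    wells = [0.0] * n_rows           # charge on the detector; row 0 = entry edge
    out = []
    for t in range(len(sky) + n_rows):
        # at tick t, detector row r sits under sky pixel t - r
        for r in range(n_rows):
            x = t - r
            if 0 <= x < len(sky):
                wells[r] += sky[x]
        out.append(wells[-1])        # bottom row reads out...
        wells = [0.0] + wells[:-1]   # ...and charge shifts down one row
    # drop reads taken while the detector was still filling or draining
    return out[n_rows - 1:n_rows - 1 + len(sky)]

# Each sky pixel comes out integrated n_rows times:
print(tdi_scan([0.0, 1.0, 5.0, 2.0], 4))   # [0.0, 4.0, 20.0, 8.0]
```

    Because the charge rides along with the drifting image, the exposure time per sky pixel equals the column-crossing time, which is what makes long strip images possible with a fixed telescope.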

  20. Survey of intravitreal injection techniques among retina specialists in Israel

    PubMed Central

    Segal, Ori; Segal-Trivitz, Yael; Nemet, Arie Y; Geffen, Noa; Nesher, Ronit; Mimouni, Michael

    2016-01-01

    Purpose: The purpose of this study was to describe antivascular endothelial growth factor intravitreal injection techniques of retinal specialists in order to establish a cornerstone for future practice guidelines. Methods: All members of the Israeli Retina Society were contacted by email to complete an anonymous, 19-question, Internet-based survey regarding their intravitreal injection techniques. Results: Overall, 66% (52/79) completed the survey. Most (98%) do not instruct patients to discontinue anticoagulant therapy and 92% prescribe treatment for patients in the waiting room. Three-quarters wear sterile gloves and prepare the patient in the supine position. A majority (71%) use sterile surgical draping. All respondents apply topical analgesics and a majority (69%) measure the distance from the limbus to the injection site. A minority (21%) displace the conjunctiva prior to injection. A majority of the survey participants use a 30-gauge needle and the most common quadrant for injection is superotemporal (33%). Less than half routinely assess postinjection optic nerve perfusion (44%). A majority (92%) apply prophylactic antibiotics immediately after the injection. Conclusion: The majority of retina specialists perform intravitreal injections similarly. However, a relatively large minority performs this procedure differently. Due to the extremely low percentage of complications, it seems as though such differences do not increase the risk. However, more evidence-based medicine, a cornerstone for practice guidelines, is required in order to identify the intravitreal injection techniques that combine safety and efficacy while causing as little discomfort to the patients as possible. PMID:27366050

  1. Digital Survey Techniques for the Documentation of Wooden Shipwrecks

    NASA Astrophysics Data System (ADS)

    Costa, E.; Balletti, C.; Beltrame, C.; Guerra, F.; Vernier, P.

    2016-06-01

    Nowadays, researchers widely employ point cloud acquisition as one of the principal forms of documentation for cultural heritage. In this paper, different digital survey techniques are employed to document an ancient wooden shipwreck, a particularly difficult kind of archaeological find owing to its material characteristics. The instability of wood and the high costs of restoration do not always offer the opportunity of recovering the hull and showing it to researchers and the public, so three-dimensional surveys are fundamental to document the original condition of the wood. The precarious condition of this material in contact with air can modify the structure and the size of the boat, requiring a fast and accurate recording technique. The collaboration between Ca' Foscari University and the Laboratory of Photogrammetry of Iuav University of Venice has made it possible to demonstrate the utility of these technologies. We surveyed a sewn boat of Roman age with multi-image photogrammetry and laser scanning. The point clouds were compared and a residual analysis was carried out to verify the characteristics and suitability of the two techniques, both of which allowed us to obtain metrically precise documentation.
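    A residual analysis between two co-registered clouds typically reduces to nearest-neighbour distances from one cloud to the other; a brute-force sketch (real pipelines use a k-d tree, and the point values below are invented for illustration):

```python
import math

def cloud_residuals(cloud_a, cloud_b):
    """For each point of cloud_a, the distance to its nearest neighbour in
    cloud_b -- the basic cloud-to-cloud residual used to compare two
    co-registered scans (O(n*m) brute force, fine for small samples)."""
    return [min(math.dist(p, q) for q in cloud_b) for p in cloud_a]

# A photogrammetric patch vs. the same patch offset 5 mm along z:
scan = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.0, 0.1, 0.0)]
laser = [(x, y, z + 0.005) for x, y, z in scan]
res = cloud_residuals(scan, laser)
```

    Summary statistics of these residuals (mean, RMS, maximum) are what allow the two techniques to be compared against a stated metric tolerance.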

  2. Power Management Techniques for Data Centers: A Survey

    SciTech Connect

    Mittal, Sparsh

    2014-07-01

    With the growing use of the internet and the exponential growth in the amount of data to be stored and processed (known as "big data"), the size of data centers has greatly increased. This, however, has resulted in a significant increase in the power consumption of data centers, so managing it has become essential. In this paper, we highlight the need to achieve energy efficiency in data centers and survey several recent architectural techniques designed for power management of data centers. We also present a classification of these techniques based on their characteristics. This paper aims to provide insights into the techniques for improving the energy efficiency of data centers and to encourage designers to invent novel solutions for managing their large power dissipation.

  3. A Survey of Architectural Techniques for Near-Threshold Computing

    DOE PAGESBeta

    Mittal, Sparsh

    2015-12-28

    Energy efficiency has now become the primary obstacle to scaling the performance of all classes of computing systems. Low-voltage computing, and specifically near-threshold voltage computing (NTC), which involves operating transistors very close to and yet above their threshold voltage, holds the promise of many-fold improvements in energy efficiency. However, NTC also presents several challenges, such as increased parametric variation, higher failure rates, and performance loss. Our paper surveys several recent techniques that aim to offset these challenges in order to fully leverage the potential of NTC. By classifying these techniques along several dimensions, we also highlight their similarities and differences. Ultimately, we hope that this paper will provide insights into state-of-the-art NTC techniques to researchers and system designers and inspire further research in this field.


  5. Survey of Product-line Verification and Validation Techniques

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn

    2007-01-01

    This report presents the results from the first task of the SARP Center Initiative, 'Product Line Verification of Safety-Critical Software.' Task 1 is a literature survey of available techniques for product line verification and validation. Section 1 of the report provides an introduction to product lines and motivates the survey of verification techniques. It describes what is reused in product-line engineering and explains the goal of verifiable conformance of the developed system to its product-line specifications. Section 2 of the report describes six lifecycle steps in product-line verification and validation. This description is based on, and refers to, the best practices extracted from the readings. It ends with a list of verification challenges for NASA product lines (2.7) and verification enablers for NASA product lines (2.8) derived from the survey. Section 3 provides resource lists of related conferences, workshops, industrial and defense industry experiences and case studies of product lines, and academic/industrial consortiums. Section 4 is a bibliography of papers and tutorials with annotated entries for relevant papers not previously discussed in sections 2 or 3.

  6. Literature survey of heat transfer enhancement techniques in refrigeration applications

    SciTech Connect

    Jensen, M.K.; Shome, B.

    1994-05-01

    A survey has been performed of the technical and patent literature on enhanced heat transfer of refrigerants in pool boiling, forced convection evaporation, and condensation. Extensive bibliographies of the technical literature and patents are given. Many passive and active techniques were examined for pure refrigerants, refrigerant-oil mixtures, and refrigerant mixtures. The citations were categorized according to enhancement technique, heat transfer mode, and tube or shell side focus. The effects of the enhancement techniques relative to smooth and/or pure refrigerants were illustrated through the discussion of selected papers. Patented enhancement techniques also are discussed. Enhanced heat transfer has demonstrated significant improvements in performance in many refrigerant applications. However, refrigerant mixtures and refrigerant-oil mixtures have not been studied extensively; no research has been performed with enhanced refrigerant mixtures with oil. Most studies have been of the parametric type; there has been inadequate examination of the fundamental processes governing enhanced refrigerant heat transfer, but some modeling is being done and correlations developed. It is clear that an enhancement technique must be optimized for the refrigerant and operating condition. Fundamental processes governing the heat transfer must be examined if models for enhancement techniques are to be developed; these models could provide the method to optimize a surface. Refrigerant mixtures, with and without oil present, must be studied with enhancement devices; there is too little known to be able to estimate the effects of mixtures (particularly NARMs) with enhanced heat transfer. Other conclusions and recommendations are offered.

  7. Surveying converter lining erosion state based on laser measurement technique

    NASA Astrophysics Data System (ADS)

    Li, Hongsheng; Shi, Tielin; Yang, Shuzi

    1998-08-01

    It is very important to survey the erosion state of a steelmaking converter lining in real time so as to optimize the technological process, extend converter durability, and reduce steelmaking production costs. This paper presents a practical method based on laser measurement: the basic principle of the method, the composition of the measurement system, and research on the key technical problems. The method is based on laser range finding to net points on the surface of the surveyed converter lining, together with angle measurement of the laser beams. The angle signals are also used to help realize the automatic scanning function. The laser signals are modulated and encoded. In addition, wavelet analysis and other filtering algorithms are adopted to denoise the data and extract useful information. The main ideas of algorithms such as net-point measurement path planning and optimal placement of the measurement device are also given, in order to improve the measurement precision and real-time performance of the system.
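    The geometric core of such a system is converting each measured range and the two beam angles into the 3-D coordinates of a net point on the lining; a textbook spherical-to-Cartesian sketch (not the authors' implementation, and the ranges below are hypothetical):

```python
import math

def beam_to_xyz(r, azimuth, elevation):
    """Laser range r plus the two measured beam angles (radians) give the
    Cartesian coordinates of one net point on the lining surface."""
    x = r * math.cos(elevation) * math.cos(azimuth)
    y = r * math.cos(elevation) * math.sin(azimuth)
    z = r * math.sin(elevation)
    return x, y, z

# Erosion at a net point: measured range minus the as-built range along
# the same beam direction (illustrative numbers, metres):
wear = 5.12 - 4.87   # 0.25 m of lining lost along this beam
```

    Repeating this over the scanned grid of net points yields a surface model of the lining whose deviation from the as-built profile maps the erosion.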

  8. Using machine learning techniques to automate sky survey catalog generation

    NASA Technical Reports Server (NTRS)

    Fayyad, Usama M.; Roden, J. C.; Doyle, R. J.; Weir, Nicholas; Djorgovski, S. G.

    1993-01-01

    We describe the application of machine classification techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Palomar Observatory Sky Survey provides comprehensive photographic coverage of the northern celestial hemisphere. The photographic plates are being digitized into images containing on the order of 10^7 galaxies and 10^8 stars. Since the size of this data set precludes manual analysis and classification of objects, our approach is to develop a software system which integrates independently developed techniques for image processing and data classification. Image processing routines are applied to identify and measure features of sky objects. Selected features are used to determine the classification of each object. GID3* and O-BTree, two inductive learning techniques, are used to automatically learn classification decision trees from examples. We describe the techniques used, the details of our specific application, and the initial encouraging results which indicate that our approach is well-suited to the problem. The benefits of the approach are increased data reduction throughput, consistency of classification, and the automated derivation of classification rules that will form an objective, examinable basis for classifying sky objects. Furthermore, astronomers will be freed from the tedium of an intensely visual task to pursue more challenging analysis and interpretation problems given automatically cataloged data.
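    The flavour of such learned classifiers can be shown with a hand-written stand-in: a tiny decision tree over two image features separating point-like from extended objects. The feature names and thresholds below are invented for illustration; the actual GID3*/O-BTree trees were induced automatically from measured plate features.

```python
def classify(obj):
    """Toy decision tree over two made-up image features: extended,
    diffuse detections go to 'galaxy', compact concentrated ones to 'star'."""
    if obj["area_pix"] > 50:                  # large isophotal area
        if obj["peak_over_total"] < 0.2:      # light spread out, not concentrated
            return "galaxy"
    return "star"

print(classify({"area_pix": 120, "peak_over_total": 0.05}))  # galaxy
print(classify({"area_pix": 9,   "peak_over_total": 0.60}))  # star
```

    The value of induction is that thresholds like these are learned from labelled examples rather than hand-tuned, which is what gives the catalog its consistency.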

  9. The History of Electromagnetic Induction Techniques in Soil Survey

    NASA Astrophysics Data System (ADS)

    Brevik, Eric C.; Doolittle, Jim

    2014-05-01

    Electromagnetic induction (EMI) has been used to characterize the spatial variability of soil properties since the late 1970s. Initially used to assess soil salinity, the use of EMI in soil studies has expanded to include: mapping soil types; characterizing soil water content and flow patterns; assessing variations in soil texture, compaction, organic matter content, and pH; and determining the depth to subsurface horizons, stratigraphic layers or bedrock, among other uses. In all cases the soil property being investigated must influence soil apparent electrical conductivity (ECa) either directly or indirectly for EMI techniques to be effective. An increasing number and diversity of EMI sensors have been developed in response to users' needs and the availability of allied technologies, which have greatly improved the functionality of these tools. EMI investigations provide several benefits for soil studies. The large amount of georeferenced data that can be rapidly and inexpensively collected with EMI provides more complete characterization of the spatial variations in soil properties than traditional sampling techniques. In addition, compared to traditional soil survey methods, EMI can more effectively characterize diffuse soil boundaries and identify included areas of dissimilar soils within mapped soil units, giving soil scientists greater confidence when collecting spatial soil information. EMI techniques do have limitations; results are site-specific and can vary depending on the complex interactions among multiple and variable soil properties. Despite this, EMI techniques are increasingly being used to investigate the spatial variability of soil properties at field and landscape scales.
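    The link from instrument reading to ECa rests on the low-induction-number approximation (commonly attributed to McNeill), under which apparent conductivity is proportional to the quadrature ratio of the secondary to primary magnetic field; a sketch (the field ratio and instrument numbers below are illustrative, not measured values):

```python
import math

MU0 = 4.0e-7 * math.pi   # vacuum permeability, H/m

def apparent_conductivity(hs_over_hp_quad, freq_hz, coil_spacing_m):
    """Low-induction-number EMI relation:
    ECa = 4 * (Hs/Hp)_quadrature / (omega * mu0 * s**2), in S/m."""
    omega = 2.0 * math.pi * freq_hz
    return 4.0 * hs_over_hp_quad / (omega * MU0 * coil_spacing_m ** 2)

# Illustrative reading at 9.8 kHz with a 3.66 m coil spacing:
eca = apparent_conductivity(1.0e-5, 9800.0, 3.66)   # S/m; x1000 gives mS/m
```

    The relation also shows why ECa responses are site-specific: the same field ratio maps to different conductivities as frequency or coil spacing changes, and the approximation itself breaks down in highly conductive soils.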

  10. Integration of Geomatic Techniques for the Urban Cavity Survey

    NASA Astrophysics Data System (ADS)

    Deidda, M.; Sanna, G.

    2013-07-01

    Cagliari, the capital of the Sardinia Region (Italy), situated in the southern part of the island, is characterized by a subsoil full of cavities. Excavation, which has gone on for more than 4000 years, developed greatly thanks to the special geological characteristics of the city's subsoil. The underground voids in which the city is rich belong to different classes, such as hydraulic structures (aqueducts, cisterns, wells, etc.), settlement works (tunnels, bomb shelters, tombs, etc.), and various other works (quarries, natural caves, etc.). This paper describes the phases of the survey of a large cavity below a high-traffic square near the Faculty of Engineering in the city of Cagliari, where the research team works. The cave, which is part of a larger complex, is important because it was used in the thirteenth century (known as the Pisan age) as a stone quarry, and traces of this activity have to be protected. Moreover, during the last forty years the continuous crossing of vehicles has cracked the roof of the cave, compromising the stability of the entire area. Consequently, a plan was developed to make the whole cavity safe and usable for visits. The study of the safety of the cave involved different professionals, including geologists, engineers, and builders. The goal of the University of Cagliari geomatics team was to solve two problems: to obtain geometrical information about the void and to correctly place the cave in the context of existing maps. The survey and its products, useful for the investigations of the technicians involved, had to comply with tolerances of 3 cm in the horizontal and 5 cm in the vertical component. The approach chosen for this purpose was to integrate different geomatic techniques. The cave was surveyed using a laser scanner (Faro Photon 80) in order to obtain a 3D model of the cave from which all the geometrical information was derived, while both classic topography and GPS techniques were used to include the cave in the

  11. A survey of CPU-GPU heterogeneous computing techniques

    DOE PAGESBeta

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-07-04

    As both CPUs and GPUs become employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have their unique features and strengths, and hence CPU-GPU collaboration is inevitable to achieve high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs), such as workload partitioning, that enable utilizing both CPU and GPU to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler, and application level. Further, we review both discrete and fused CPU-GPU systems and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). We believe that this paper will provide insights into the working and scope of applications of HCTs to researchers and motivate them to further harness the computational power of CPUs and GPUs to achieve the goal of exascale performance.
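One of the surveyed HCT families, static workload partitioning, can be illustrated with a toy proportional split of work items between the two PUs. The throughput numbers below are hypothetical; a real heterogeneous runtime would measure or adapt them online:

```python
# Toy sketch of static CPU-GPU workload partitioning: assign each
# processing unit a share of the N work items proportional to its
# measured throughput, so both finish at roughly the same time.

def partition(n_items, cpu_rate, gpu_rate):
    """Return (cpu_items, gpu_items) proportional to throughput."""
    gpu_items = round(n_items * gpu_rate / (cpu_rate + gpu_rate))
    return n_items - gpu_items, gpu_items

# A GPU four times faster than the CPU gets 4/5 of the work, so both
# PUs finish in about the same wall-clock time.
print(partition(1000, cpu_rate=200, gpu_rate=800))  # → (200, 800)
```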

  12. A survey of CPU-GPU heterogeneous computing techniques

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-07-04

    As both CPUs and GPUs become employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have their unique features and strengths, and hence CPU-GPU collaboration is inevitable to achieve high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs), such as workload partitioning, that enable utilizing both CPU and GPU to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler, and application level. Further, we review both discrete and fused CPU-GPU systems and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). We believe that this paper will provide insights into the working and scope of applications of HCTs to researchers and motivate them to further harness the computational power of CPUs and GPUs to achieve the goal of exascale performance.

  13. AIRBORNE INERTIAL SURVEYING USING LASER TRACKING AND PROFILING TECHNIQUES.

    USGS Publications Warehouse

    Cyran, Edward J.

    1986-01-01

    The U. S. Geological Survey, through a contract with the Charles Stark Draper Laboratory, has developed the Aerial Profiling of Terrain System. This is an airborne inertial surveying system designed to use a laser tracker to provide position and velocity updates and a laser profiler to measure terrain elevations. The performance characteristics of the system are discussed, with emphasis placed on the performance of the laser devices. The results of testing the system are summarized for both performance evaluation and applications.

  14. Pre-survey feasibility assessment of the persistent scatterer technique

    NASA Astrophysics Data System (ADS)

    Plank, Simon; Singer, John; Thuro, Kurosch

    2013-04-01

    The remote sensing technique persistent scatterer synthetic aperture radar interferometry (PS-InSAR) is a powerful method for detecting and monitoring landslides with accuracy of up to a few millimeters. However, a precondition for reliable PS-InSAR processing is a stack of at least 15 to 50 SAR images, which makes processing very time-consuming and expensive. Furthermore, successful PS-InSAR application requires a high number of measurement points within the area of interest - so-called persistent scatterers (PS), which are scatterers with high coherence. Estimating the number and distribution of PS within a site before several SAR images have been recorded and processed is very difficult. Therefore, we developed three new methods for PS estimation prior to the acquisition of the SAR data. These methods are based on freely available or low-cost optical remote sensing data, land cover data (e.g. GlobCover and CORINE), as well as topographic maps and OpenStreetMap data. Using empirical approaches, these geodata were compared with the results of real PS-InSAR processing of several sites. First, the well-known normalized difference vegetation index (NDVI), computed from optical remote sensing data, was used in an entirely new approach to estimate PS prior to the SAR data acquisition of the area of interest. The result of this method is, for each pixel of the NDVI image, an estimate of the probability of getting a PS at a certain NDVI value. When using freely available medium-spatial-resolution optical data (e.g. Landsat and ASTER), this PS estimation procedure works very well in areas of sparse vegetation. Worldwide application of this method requires high-spatial-resolution optical sensors; the NDVI-based PS estimation method can then also be applied in areas covered by denser vegetation. The second PS estimation method is based on freely available land cover datasets. The result of this method is an estimate of the PS density (PS/km²) for each type of
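The NDVI underlying the first estimation method is the standard ratio (NIR - Red)/(NIR + Red). A minimal sketch, using hypothetical reflectance values rather than real Landsat or ASTER pixels:

```python
# Standard NDVI computation: values near 0 indicate bare ground, rock,
# or built surfaces (likelier persistent scatterers); values near 1
# indicate dense vegetation (few or no persistent scatterers).

def ndvi(nir, red):
    total = nir + red
    return (nir - red) / total if total else 0.0

print(round(ndvi(0.45, 0.40), 2))  # sparse cover → 0.06
print(round(ndvi(0.60, 0.10), 2))  # dense vegetation → 0.71
```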

  15. Survey of techniques used to preserve biological materials

    NASA Technical Reports Server (NTRS)

    Feinler, E. J.; Hubbard, R. W.

    1972-01-01

    The techniques used to preserve biological materials are documented and summarized. The report is presented in a handbook format that categorizes the most important preservation techniques available, and includes a representative sampling of the thousands of applications of these techniques to biological materials and organisms. Details of the information coverage and method of approach are outlined. Data are given in tabular form, and an index and extensive bibliography are included.

  16. Sampling for Telephone Surveys: Do the Results Depend on Technique?

    ERIC Educational Resources Information Center

    Franz, Jennifer D.

    Two basic methods exist for drawing probability samples to be used in telephone surveys: directory sampling (from alphabetical or street directories) and random digit dialing (RDD). RDD includes unlisted numbers, whereas directory sampling includes only listed numbers. The goal of this paper is to estimate the effect of failure to include…

  17. A Survey of Librarian Perceptions of Information Literacy Techniques

    ERIC Educational Resources Information Center

    Yearwood, Simone L.; Foasberg, Nancy M.; Rosenberg, Kenneth D.

    2015-01-01

    Teaching research competencies and information literacy is an integral part of the academic librarian's role. There has long been debate among librarians over what are the most effective methods of instruction for college students. Library Faculty members at a large urban university system were surveyed to determine their perceptions of the…

  18. Getting Parents and Students Involved: Using Survey and Interview Techniques.

    ERIC Educational Resources Information Center

    Downs, Judy R.

    1993-01-01

    Describes a series of class activities involving student surveys and interviews with parents and other adults. Discusses possible interview topics ranging from important inventions to simulated interviews with historical figures. Reports that student interest improved and parents became more involved with school activities. (CFR)

  19. A survey of GPU-based medical image computing techniques.

    PubMed

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming; Wang, Defeng

    2012-09-01

    Medical imaging currently plays a crucial role throughout clinical applications, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets that must be processed in practical clinical applications. With the rapidly improving performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for newcomers and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and existing traditional applications in three areas of medical image processing, namely segmentation, registration, and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080

  20. A survey of GPU-based medical image computing techniques

    PubMed Central

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming

    2012-01-01

    Medical imaging currently plays a crucial role throughout clinical applications, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets that must be processed in practical clinical applications. With the rapidly improving performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for newcomers and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and existing traditional applications in three areas of medical image processing, namely segmentation, registration, and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080

  1. USES OF MARKETING TECHNIQUES BY THE U. S. GEOLOGICAL SURVEY.

    USGS Publications Warehouse

    McDermott, Michael P.

    1983-01-01

    The use of marketing techniques by government agencies to provide more efficient and effective dissemination of their information is a fairly recent development. A recessive economy and increased scrutiny of operations have become powerful incentives to maximize revenues and minimize expenses wherever possible, as long as the primary mission of public service is satisfactorily met.

  2. Survey of Temperature Measurement Techniques For Studying Underwater Shock Waves

    NASA Technical Reports Server (NTRS)

    Danehy, Paul M.; Alderfer, David W.

    2004-01-01

    Several optical methods for measuring temperature near underwater shock waves are reviewed and compared. The relative merits of the different techniques are compared, considering accuracy, precision, ease of use, applicable temperature range, maturity, spatial resolution, and whether or not special additives are required.

  3. Watermarking techniques used in medical images: a survey.

    PubMed

    Mousavi, Seyed Mojtaba; Naghsh, Alireza; Abu-Bakar, S A R

    2014-12-01

    The ever-growing numbers of medical digital images and the need to share them among specialists and hospitals for better and more accurate diagnosis require that patients' privacy be protected. As a result of this, there is a need for medical image watermarking (MIW). However, MIW needs to be performed with special care for two reasons. Firstly, the watermarking procedure cannot compromise the quality of the image. Secondly, confidential patient information embedded within the image should be flawlessly retrievable without risk of error after image decompression. Despite extensive research undertaken in this area, there is still no method available to fulfill all the requirements of MIW. This paper aims to provide a useful survey on watermarking and offer a clear perspective for interested researchers by analyzing the strengths and weaknesses of different existing methods. PMID:24871349

  4. GPR as a Low Impact Paleontological Survey Technique

    NASA Astrophysics Data System (ADS)

    Sturdevant, G. C.; Leverence, R.; Stewart, R.

    2013-12-01

    The Deweyville Formation, a Pleistocene fluvial sandstone, is a prolific source of megafaunal fossils from periods of low-stand environmental conditions. GPR was employed in an environmentally sensitive area in close proximity to a salt dome in Northwest Harris County, Texas, as a method of evaluating the probable paleo-depositional environment and prospecting for potential further site development of two distinct fossiliferous zones. The primary zone of interest is a lag-gravel-bounded sand responsible for producing a regionally unique fossil assemblage including South American megafauna (Lundelius et al., 2013). The secondary zone of interest contains undisturbed mammoth remains housed in coarse white sand emplaced on top of a clay drape, which has been hypothesized to represent an oxbow lake formed by the meandering paleo-Brazos River. With an accurate map of the paleo-channel, planning of future activity can focus on maximizing fossil recovery and minimizing site impact. A Pulse EKKO system at 250 MHz, 400 MHz, and 1 GHz was employed in a prospect area proximal to the secondary site to calibrate and evaluate these systems for their resolution and penetration depth in the modern sediments. The data were processed using the EKKO Mapper and EKKO View Deluxe software packages, and 3D volumes were produced and sliced. Preliminary results from the 250 MHz system demonstrate successful imaging of the sand-clay interface. After these surveys were run, a small portion of the site was excavated to confirm the estimated velocities and the observed anomalies, refine our modeling and interpretation, and improve grid design for further surveys. It was confirmed that the sand-clay interface was easily observable using GPR; however, the grid spacing proved to be too wide, leading to artifacts in the 3D volume produced.
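The depth estimates behind such GPR interpretation rest on the standard two-way travel-time conversion, depth = v·t/2. A minimal sketch with an illustrative dry-sand velocity, not the value calibrated at this site:

```python
# Standard GPR depth conversion: a reflection recorded at two-way
# travel time t (ns), in material with radar velocity v (m/ns), comes
# from a reflector at depth v * t / 2.

def gpr_depth_m(twt_ns, velocity_m_per_ns):
    """Reflector depth (m) from two-way travel time (ns)."""
    return velocity_m_per_ns * twt_ns / 2.0

# With a typical dry-sand velocity of ~0.12 m/ns (illustrative), a
# reflection arriving at 50 ns would place the sand-clay interface at
# about 3 m depth.
print(gpr_depth_m(50.0, 0.12))  # → 3.0
```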

  5. A survey of reflectometry techniques with applications to TFTR

    SciTech Connect

    Collazo, I.; Stacey, W.M.; Wilgen, J.; Hanson, G.; Bigelow, T.; Thomas, C.E.; Bretz, N.

    1993-12-01

    This report presents a review of reflectometry with particular attention to eXtraordinary mode (X-mode) reflectometry using the novel technique of dual frequency differential phase. The advantage of using an X-mode wave is that it can probe the edge of the plasma with much higher resolution and using a much smaller frequency range than with the Ordinary mode (O-mode). The general problem with previous full phase reflectometry techniques is that of keeping track of the phase (on the order of 1000 fringes) as the frequency is swept over the band. The dual frequency phase difference technique has the advantage that since it is keeping track of the phase difference of two frequencies with a constant frequency separation, the fringe counting is on the order of only 3 to 5 fringes. This fringe count, combined with the high resolution of the X-mode wave and the small plasma access requirements of reflectometry, make X-mode reflectometry a very attractive diagnostic for today's experiments and future fusion devices.

  6. A Survey of Computational Intelligence Techniques in Protein Function Prediction

    PubMed Central

    Tiwari, Arvind Kumar; Srivastava, Rajeev

    2014-01-01

    With the advancement of high-throughput microarray technologies, there has been massive growth in the number of proteins whose functions are unknown. Protein function prediction is among the most challenging problems in bioinformatics. In the past, homology-based approaches were used to predict protein function, but they fail when a new protein is dissimilar from previously characterized ones. Therefore, to alleviate the problems associated with traditional homology-based approaches, numerous computational intelligence techniques have been proposed in the recent past. This paper presents a state-of-the-art comprehensive review of various computational intelligence techniques for protein function prediction using sequence, structure, protein-protein interaction network, and gene expression data, across wide areas of application such as prediction of DNA- and RNA-binding sites, subcellular localization, enzyme functions, signal peptides, catalytic residues, nuclear/G-protein coupled receptors, membrane proteins, and pathway analysis from gene expression datasets. This paper also summarizes the results obtained by many researchers who have solved these problems using computational intelligence techniques with appropriate datasets to improve prediction performance. The summary shows that ensemble classifiers and the integration of multiple heterogeneous data sources are useful for protein function prediction. PMID:25574395

  7. An analysis of oil and gas supply modeling techniques and a survey of offshore supply models

    SciTech Connect

    Walls, M.A.

    1990-01-01

    This report surveys the literature on empirical oil and gas supply modeling techniques. These techniques are categorized as geologic/engineering, econometric, or hybrid - the last being a combination of geologic and econometric techniques. The geologic/engineering models are further disaggregated into play analysis models and discovery process models. The strengths and weaknesses of each of the models are discussed. The report concludes with a discussion of how these techniques have been applied to offshore oil and gas supply.

  8. A survey of third-generation simulation techniques

    NASA Astrophysics Data System (ADS)

    Hachtel, G. D.; Sangiovanni-Vincentelli, A. L.

    1981-10-01

    A tutorial review is presented of 'third-generation' simulators and simulation techniques. It is attempted to provide a unified treatment of the various disparate simulator types based on the concept of decomposition of large-scale systems. The various 'third-generation' simulators are classified and described in terms of the role played by certain matrix forms in their formulation, taking into account the bordered block diagonal (bbd), the bordered block triangular (bbt), the bordered lower triangular (blt), the block diagonal (bd), the block triangular (bt), and the lower triangular (lt).

  9. Data indexing techniques for the EUVE all-sky survey

    NASA Technical Reports Server (NTRS)

    Lewis, J.; Saba, V.; Dobson, C.

    1992-01-01

    This poster describes techniques developed for manipulating large full-sky data sets for the Extreme Ultraviolet Explorer project. The authors adapted the quadrilateralized cubic sphere indexing algorithm to efficiently store and process several types of large data sets, such as full-sky maps of photon counts, exposure time, and count rates. A variation of this scheme is used to index sparser data, such as individual photon events and viewing times for selected areas of the sky, which are eventually used to create EUVE source catalogs.

  10. Survey of Natural Language Processing Techniques in Bioinformatics

    PubMed Central

    Zeng, Zhiqiang; Shi, Hua; Wu, Yun; Hong, Zhiling

    2015-01-01

    Informatics methods such as text mining and natural language processing are frequently involved in bioinformatics research. In this study, we discuss text mining and natural language processing methods in bioinformatics from two perspectives. First, we aim to search for biological knowledge, retrieve references using text mining methods, and reconstruct databases. For example, protein-protein interactions and gene-disease relationships can be mined from PubMed. Then, we analyze the applications of text mining and natural language processing techniques in bioinformatics, including predicting protein structure and function and detecting noncoding RNA. Finally, numerous methods and applications, as well as their contributions to bioinformatics, are discussed for future use by text mining and natural language processing researchers. PMID:26525745

  11. Survey of Natural Language Processing Techniques in Bioinformatics.

    PubMed

    Zeng, Zhiqiang; Shi, Hua; Wu, Yun; Hong, Zhiling

    2015-01-01

    Informatics methods such as text mining and natural language processing are frequently involved in bioinformatics research. In this study, we discuss text mining and natural language processing methods in bioinformatics from two perspectives. First, we aim to search for biological knowledge, retrieve references using text mining methods, and reconstruct databases. For example, protein-protein interactions and gene-disease relationships can be mined from PubMed. Then, we analyze the applications of text mining and natural language processing techniques in bioinformatics, including predicting protein structure and function and detecting noncoding RNA. Finally, numerous methods and applications, as well as their contributions to bioinformatics, are discussed for future use by text mining and natural language processing researchers. PMID:26525745

  12. Some fuzzy techniques for staff selection process: A survey

    NASA Astrophysics Data System (ADS)

    Md Saad, R.; Ahmad, M. Z.; Abu, M. S.; Jusoh, M. S.

    2013-04-01

    With a high level of business competition, it is vital to have flexible staff who are able to adapt themselves to work circumstances. However, staff selection is not an easy problem to solve, even in a simplified version containing only a single criterion and a homogeneous skill. When multiple criteria and various skills are involved, the problem becomes much more complicated. In addition, some information cannot be measured precisely. This is patently obvious when dealing with opinions, thoughts, feelings, beliefs, etc. One possible tool for handling this issue is fuzzy set theory. Therefore, the objective of this paper is to review existing fuzzy techniques for solving the staff selection problem. It classifies several existing research methods and identifies areas where there are gaps that need further research. Finally, the paper concludes by suggesting new ideas for future research based on the gaps identified.
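One common building block in such fuzzy selection methods is the triangular membership function, which grades an imprecise rating against a fuzzy set such as "good". The sketch below is a generic illustration, not any method from the surveyed papers; the breakpoints (4, 7, 10), ratings, and weights are all made up:

```python
# Grade hypothetical candidate ratings against the fuzzy set "good"
# with a triangular membership function, then aggregate per candidate
# with a weighted sum across criteria.

def triangular(x, a, b, c):
    """Degree of membership in the triangular fuzzy set (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

weights = {"communication": 0.6, "experience": 0.4}
candidates = {
    "A": {"communication": 8.5, "experience": 6.0},
    "B": {"communication": 5.0, "experience": 9.0},
}

for name, ratings in candidates.items():
    score = sum(w * triangular(ratings[crit], 4, 7, 10)
                for crit, w in weights.items())
    print(name, round(score, 3))
```

Full fuzzy multi-criteria methods add steps such as fuzzification of linguistic labels and defuzzification of aggregate scores; the membership grading above is the common core.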

  13. A survey of techniques for refrigeration, reliquefaction, and production of slush for hydrogen

    NASA Technical Reports Server (NTRS)

    Overcash, Dan R.

    1990-01-01

    Several techniques for the refrigeration, reliquefaction, and production of slush from hydrogen were surveyed. The techniques included an auger; bubbling helium gas; Simon desorption; the Peltier effect; Joule-Kelvin expansion using Stirling, Brayton, and Vuilleumier approaches; rotary reciprocating machines; a dilution refrigerator; adiabatic demagnetization of a paramagnetic salt; and adiabatic magnetization of a superconductor.

  14. Quantifying Stream Habitat: Relative Effort Versus Quality of Competing Remote Sensing & Ground-Based Survey Techniques

    NASA Astrophysics Data System (ADS)

    Bangen, S. G.; Wheaton, J. M.; Bouwes, N.

    2010-12-01

    Numerous field and analytical methods exist to assist in quantifying the quantity and quality of in-stream habitat for salmonids. These methods range from field sketches or ‘tape and stick’ ground-based surveys through to spatially explicit topographic and aerial photographic surveys from a mix of ground-based and remotely sensed airborne platforms. Although some investigators have assessed the quality of specific individual survey methods, an inter-comparison of competing techniques across a diverse range of habitat conditions (wadeable headwater channels to non-wadeable mainstem channels) has not yet been undertaken. In this study, we seek to quantify the relative quality (i.e. accuracy, precision, extent) of habitat metrics and inventories derived from different ground-based and remotely sensed surveys of varying degrees of sophistication, as well as enumerate the effort and cost of completing the surveys. Over the summer of 2010, seven sample reaches of varying habitat complexity were surveyed in the Lemhi River Basin, Idaho, USA. Three different traditional (“stick and tape”) survey techniques were used, including a variant using map-grade GPS. Complete topographic/bathymetric surveys were attempted at each site using separate RTK GPS, total station, ground-based LiDAR, boat-based echo-sounding (with ADCP), traditional airborne LiDAR, and imagery-based spectral methods. Separate georectified aerial imagery surveys were acquired using a tethered blimp, a drone UAV, and a traditional fixed-wing aircraft. Preliminary results from the surveys highlight that no single technique works across the full range of conditions where stream habitat surveys are needed. The results are helpful for understanding the strengths and weaknesses of each approach in specific conditions, and how a hybrid of data acquisition methods can be used to build a more complete quantification of habitat conditions in rivers.

  15. Coating integrity survey using DC voltage gradient technique at Korea Gas Corporation

    SciTech Connect

    Cho, Y.B.; Park, K.W.; Jeon, K.S.; Song, H.S.; Won, D.S.; Lee, S.M.; Kho, Y.T.

    1996-12-31

    The reliability and applicability of various coating defect detection techniques were investigated using a mock pipe. It is shown that both the close interval potential survey and DC voltage gradient methods are inadequate as field techniques: they require considerable cathodic polarization in order to effectively locate coating defects. The DC voltage gradient with current interruption technique is recommended as a viable field method in that it is able to precisely locate defects irrespective of CP condition. Using this method, a field survey was undertaken on KGC's 120 km pipeline, and 106 suspected defects were located.

  16. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    DOE PAGESBeta

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrence and impact of faults in computing systems. This has made 'reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving the resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, caches, and main memory. In addition, we discuss techniques proposed for non-volatile memory, GPUs, and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify the vulnerability of processor structures. Finally, we believe that this survey will help researchers, system architects, and processor designers gain insights into techniques for improving the reliability of computing systems.

  17. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrence and impact of faults in computing systems. This has made 'reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving the resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, caches, and main memory. In addition, we discuss techniques proposed for non-volatile memory, GPUs, and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify the vulnerability of processor structures. Finally, we believe that this survey will help researchers, system architects, and processor designers gain insights into techniques for improving the reliability of computing systems.

  18. A rapid survey technique for Tropilaelaps mite (Mesostigmata: Laelapidae) detection.

    PubMed

    Pettis, Jeffery S; Rose, Robyn; Lichtenberg, Elinor M; Chantawannakul, Panuwan; Buawangpong, Ninat; Somana, Weeraya; Sukumalanand, Prachaval; Vanengelsdorp, Dennis

    2013-08-01

    Parasitic Tropilaelaps (Delfinado and Baker) mites are a damaging pest of European honey bees (Apis mellifera L.) in Asia. These mites represent a significant threat if introduced to other regions of the world, warranting implementation of Tropilaelaps mite surveillance in uninfested regions. Current Tropilaelaps mite detection methods are unsuitable for efficient large-scale screening. We developed and tested a new bump technique that consists of firmly rapping a honey bee brood frame over a collecting pan. Our method was easier to implement than current detection tests, reduced the time spent in each apiary, and minimized brood destruction. This gain in feasibility offsets the test's lower rate of detecting infested colonies (sensitivity: 36.3% for the bump test versus 54.2% and 56.7% for the two most sensitive methods currently used in Asia). Given this sensitivity, we suggest that screening programs sample seven colonies per apiary (independent of apiary size) and 312 randomly selected apiaries in a region to be 95% sure of detecting an incipient Tropilaelaps mite invasion. Further analyses counter the currently held view that Tropilaelaps mites prefer drone bee brood cells: Tropilaelaps mite infestation was 3.5 +/- 0.9% in drone brood and 5.7 +/- 0.6% in worker brood. We propose the bump test as a standard tool for monitoring Tropilaelaps mite presence in regions thought to be free from infestation. However, regulators may favor the sensitivity of the drop test (collecting mites that fall to the bottom of a hive on sticky boards) over the less time-intensive bump test. PMID:24020263
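Screening designs like this follow the standard "detect at least one" calculation: to be 95% sure of at least one detection when each sampled unit detects with probability p, sample n = ⌈ln(0.05)/ln(1 − p)⌉ units. The per-apiary detection probability below is hypothetical; the paper's 312-apiary figure rests on its own prevalence and sensitivity assumptions, which are not reproduced here:

```python
import math

# Number of independently screened apiaries needed so that the chance
# of at least one detection reaches the target confidence, given a
# per-apiary detection probability p: n = ceil(ln(1-conf) / ln(1-p)).

def apiaries_needed(p_detect, confidence=0.95):
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_detect))

# With a hypothetical 1% per-apiary detection probability:
print(apiaries_needed(0.01))  # → 299
```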

  19. Use of structured personality survey techniques to indicate operator response to stressful situations

    SciTech Connect

    Waller, M.A.

    1990-01-01

    Under given circumstances, a person will tend to operate in one of four dominant orientations: (1) to perform tasks; (2) to achieve consensus; (3) to achieve understanding; or (4) to maintain structure. Historically, personality survey techniques, such as the Myers-Briggs Type Indicator, have been used to determine these tendencies. While these techniques can accurately reflect a person's orientation under normal social situations, under different sets of conditions the same person may exhibit other tendencies, displaying a similar or entirely different orientation. While most people do not exhibit extreme tendencies or changes of orientation, the shift in personality from normal to stressful conditions can be rather dramatic, depending on the individual. Structured personality survey techniques have been used to indicate operator response to stressful situations. These techniques have been extended to indicate the balance of orientations within the control room team across its various levels of cognizance.

  20. A Survey of the Practices, Procedures, and Techniques in Undergraduate Organic Chemistry Teaching Laboratories

    ERIC Educational Resources Information Center

    Martin, Christopher B.; Schmidt, Monica; Soniat, Michael

    2011-01-01

    A survey was conducted of four-year institutions that teach undergraduate organic chemistry laboratories in the United States. The data include results from over 130 schools, describe the current practices at these institutions, and discuss statistical results such as the scale of the laboratories performed, the chemical techniques applied,…

  1. A methodological intercomparison of topographic survey techniques for characterizing wadeable streams and rivers

    NASA Astrophysics Data System (ADS)

    Bangen, Sara G.; Wheaton, Joseph M.; Bouwes, Nicolaas; Bouwes, Boyd; Jordan, Chris

    2014-02-01

    Fine-scale (submeter) resolution digital elevation models (DEMs) created from high precision (subcentimeter) instruments (e.g., total station, rtkGPS, and laser scanning) have become ubiquitous in the field of fluvial geomorphology. They permit a diverse range of spatially explicit analyses including hydraulic modeling, habitat modeling, and geomorphic change detection. While previous studies have assessed the quality of specific topographic survey methods at individual sites or across a limited number of sites, an intercomparison of survey technologies across a diverse range of wadeable streams could help clarify which techniques are feasible, as well as which work best under what circumstances and for what purposes. Although a wealth of existing studies and protocols explain how to undertake each individual technique, in this study we seek to provide guidance on what techniques to use in which circumstances. We quantified the relative quality and the amount of effort spent collecting data to derive bare earth topography from an array of ground-based and airborne survey techniques. We used topographic survey data collected over the summer of 2010 from six sample reaches of varying complexity in the Lemhi River basin, Idaho, USA. We attempted to conduct complete, replicate surveys at each site using total station (TS), real-time kinematic (rtk) GPS, discrete return terrestrial laser scanner (TLS), and airborne LiDaR surveys (ALS). We evaluated the precision and accuracy of derived bare earth DEMs relative to the higher precision total station point data. Discrepancies between pairwise techniques were calculated using propagated DEM errors thresholded at a 95% confidence interval. Mean discrepancies between total station and rtkGPS DEMs were relatively low (≤ 0.05 m), yet TS data collection time was up to 2.4 times longer than rtkGPS. The ALS DEMs had lower accuracy than TS or rtkGPS DEMs, but the aerial coverage and floodplain context of the ALS data set was
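The propagated-error thresholding used to compare DEM pairs can be sketched as follows. This is a minimal illustration of the general approach (difference the two surfaces, combine the per-survey cell uncertainties in quadrature, and keep only discrepancies exceeding the 95% confidence bound), not the authors' code:

```python
import numpy as np

def significant_change(dem_a, dem_b, sigma_a, sigma_b, z=1.96):
    """DEM-of-difference with propagated error thresholding: a cell's
    elevation discrepancy counts only where it exceeds the combined
    uncertainty of the two surveys at ~95% confidence (z = 1.96)."""
    dod = dem_a - dem_b
    # Independent survey errors add in quadrature.
    sigma_dod = np.sqrt(sigma_a ** 2 + sigma_b ** 2)
    mask = np.abs(dod) > z * sigma_dod
    return dod, mask
```

With 0.05 m uncertainty in each survey, the combined threshold is about 0.14 m, so a 0.02 m discrepancy is indistinguishable from noise while a 0.30 m one is flagged.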

  2. A Methodological Intercomparison of Topographic and Aerial Photographic Habitat Survey Techniques

    NASA Astrophysics Data System (ADS)

    Bangen, S. G.; Wheaton, J. M.; Bouwes, N.

    2011-12-01

    A severe decline in Columbia River salmonid populations and subsequent Federal listing of subpopulations has mandated both the monitoring of populations and evaluation of the status of available habitat. Numerous field and analytical methods exist to assist in the quantification of the abundance and quality of in-stream habitat for salmonids. These methods range from field 'stick and tape' surveys to spatially explicit topographic and aerial photographic surveys from a mix of ground-based and remotely sensed airborne platforms. Although several previous studies have assessed the quality of specific individual survey methods, the intercomparison of competing techniques across a diverse range of habitat conditions (wadeable headwater channels to non-wadeable mainstem channels) has not yet been elucidated. In this study, we seek to enumerate relative quality (i.e. accuracy, precision, extent) of habitat metrics and inventories derived from an array of ground-based and remotely sensed surveys of varying degrees of sophistication, as well as quantify the effort and cost in conducting the surveys. Over the summer of 2010, seven sample reaches of varying habitat complexity were surveyed in the Lemhi River Basin, Idaho, USA. Complete topographic surveys were attempted at each site using rtkGPS, total station, ground-based LiDaR and traditional airborne LiDaR. Separate high spatial resolution aerial imagery surveys were acquired using a tethered blimp, a drone UAV, and a traditional fixed-wing aircraft. Here we also developed a relatively simplistic methodology for deriving bathymetry from aerial imagery that could be readily employed by instream habitat monitoring programs. The quality of bathymetric maps derived from aerial imagery was compared with rtkGPS topographic data. 
The results are helpful for understanding the strengths and weaknesses of different approaches in specific conditions, and how a hybrid of data acquisition methods can be used to build a more complete …

  3. Knik Glacier, Alaska; summary of 1979, 1980, and 1981 data and introduction of new surveying techniques

    USGS Publications Warehouse

    Mayo, L.R.; Trabant, D.C.

    1982-01-01

    Knik Glacier in south-central Alaska has the potential to reform Lake George, Alaska's largest glacier-dammed lake. Measurements of surface altitude, snow depth, terminus position, glacier speed, and ice depth are being made in an attempt to determine the mechanisms that could cause a significant re-advance of the glacier. New surveying and data reduction techniques were developed by the authors and employed successfully at Knik Glacier. These include precise geodetic surveying by the 'trisection' technique, calculation of surface altitude at a specially-fixed 'index point' from three point measurements on a rough, moving glacier surface, and calculation of ice thickness from low frequency radar measurements. In addition, this report summarizes the data collected from 1979 to 1981 in support of this goal. (USGS)

  4. A Survey Of Techniques for Managing and Leveraging Caches in GPUs

    SciTech Connect

    Mittal, Sparsh

    2014-09-01

    Initially introduced as special-purpose accelerators for graphics applications, graphics processing units (GPUs) have now emerged as general-purpose computing platforms for a wide range of applications. To address the requirements of these applications, modern GPUs include sizable hardware-managed caches. However, several factors, such as the unique architecture of GPUs and the rise of CPU–GPU heterogeneous computing, demand effective management of caches to achieve high performance and energy efficiency. Recently, several techniques have been proposed for this purpose. In this paper, we survey several architectural and system-level techniques proposed for managing and leveraging GPU caches. We also discuss the importance and challenges of cache management in GPUs. The aim of this paper is to provide readers with insights into cache management techniques for GPUs and motivate them to propose even better techniques for leveraging the full potential of caches in the GPUs of tomorrow.
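As a point of reference for the policies discussed in such surveys, the baseline against which GPU cache-management techniques (bypassing, pinning, partitioning) are usually compared is plain LRU replacement. A toy fully-associative cache with LRU replacement can be simulated in a few lines; this is an illustrative baseline, not a technique from the paper:

```python
from collections import OrderedDict

class LRUCache:
    """Toy fully-associative cache with least-recently-used (LRU)
    replacement, tracking hit/miss counts for a stream of addresses."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.lines = OrderedDict()   # insertion order = recency order
        self.hits = self.misses = 0

    def access(self, addr):
        if addr in self.lines:
            self.lines.move_to_end(addr)        # mark most recently used
            self.hits += 1
        else:
            self.misses += 1
            if len(self.lines) >= self.capacity:
                self.lines.popitem(last=False)  # evict the LRU line
            self.lines[addr] = True
```

Running a reuse-heavy address stream through such a model makes the thrashing behavior that motivates GPU-specific policies easy to observe.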

  5. Knowledge based systems: A critical survey of major concepts, issues and techniques. Visuals

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1984-01-01

    This Working Paper Series entry represents a collection of presentation visuals associated with the companion report entitled, Knowledge Based Systems: A Critical Survey of Major Concepts, Issues, and Techniques, USL/DBMS NASA/RECON Working Paper Series report number DBMS.NASA/RECON-9. The objectives of the report are to: examine various techniques used to build the KBS; to examine at least one KBS in detail, i.e., a case study; to list and identify limitations and problems with the KBS; to suggest future areas of research; and to provide extensive reference materials.

  6. A survey of light-scattering techniques used in the remote monitoring of atmospheric aerosols

    NASA Technical Reports Server (NTRS)

    Deirmendjian, D.

    1980-01-01

    A critical survey of the literature on the use of light-scattering mechanisms in the remote monitoring of atmospheric aerosols, their geographical and spatial distribution, and temporal variations was undertaken to aid in the choice of future operational systems, both ground-based and air- or space-borne. An evaluation, mainly qualitative and subjective, of various techniques and systems is carried out. No single system is found to be adequate for operational purposes. A combination of earth-surface and space-borne systems based mainly on passive techniques involving solar radiation, with active (lidar) systems to provide auxiliary or backup information, is tentatively recommended.

  7. A Survey of Partition-Based Techniques for Copy-Move Forgery Detection

    PubMed Central

    Nathalie Diane, Wandji Nanda; Xingming, Sun; Moise, Fah Kue

    2014-01-01

    A copy-move forged image results from a specific type of image tampering procedure carried out by copying a part of an image and pasting it on one or more parts of the same image generally to maliciously hide unwanted objects/regions or clone an object. Therefore, detecting such forgeries mainly consists in devising ways of exposing identical or relatively similar areas in images. This survey attempts to cover existing partition-based copy-move forgery detection techniques. PMID:25152931
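A minimal example of the partition-based idea in its exact-match variant: extract overlapping blocks, sort them so identical content becomes adjacent, and report duplicated blocks at distinct locations. Real detectors match robust block features such as DCT or PCA coefficients rather than raw pixels; `find_duplicate_blocks` is a hypothetical helper for illustration:

```python
import numpy as np

def find_duplicate_blocks(img, b=4):
    """Naive partition-based copy-move check on a 2-D grayscale array:
    extract every overlapping b x b block, sort blocks lexicographically
    by content, and report pairs of distinct locations whose pixel
    content is identical."""
    h, w = img.shape
    feats = []
    for y in range(h - b + 1):
        for x in range(w - b + 1):
            feats.append((img[y:y + b, x:x + b].ravel().tobytes(), (y, x)))
    feats.sort(key=lambda t: t[0])          # identical blocks become adjacent
    pairs = []
    for (fa, pa), (fb, pb) in zip(feats, feats[1:]):
        if fa == fb:
            pairs.append((pa, pb))
    return pairs
```

Sorting reduces the naive O(n²) pairwise comparison to O(n log n), which is the main reason block-sorting is the workhorse of this family of detectors.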

  8. A survey of simulation and diagnostic techniques for hypersonic nonequilibrium flows

    NASA Technical Reports Server (NTRS)

    Sharma, Surendra P.; Park, Chul

    1987-01-01

    The possible means of simulating nonequilibrium reacting flows in hypersonic environments, and the required diagnostic techniques, are surveyed in two categories: bulk flow behavior and determination of chemical rate parameters. Flow visualization of shock shapes for validation of computational-fluid dynamic calculations is proposed. The facilities and the operating conditions necessary to produce the required nonequilibrium conditions, the suitable optical techniques, and their sensitivity requirements, are surveyed. Shock-tubes, shock-tunnels, and ballistic ranges in a wide range of sizes and strengths are found to be useful for this purpose, but severe sensitivity requirements are indicated for the optical instruments, which can be met only by using highly-collimated laser sources. Likewise, for the determination of chemical parameters, this paper summarizes the quantities that need to be determined, required facilities and their operating conditions, and the suitable diagnostic techniques and their performance requirements. Shock tubes of various strengths are found to be useful for this purpose. Vacuum ultraviolet absorption and fluorescence spectroscopy and coherent anti-Stokes Raman spectroscopy are found to be the techniques best suited for the measurements of the chemical data.

  9. Gastronet survey on the use of one- or two-person technique for colonoscopy insertion

    PubMed Central

    2011-01-01

    Background Usually, colonoscopy insertion is performed by the colonoscopist (one-person technique). Quite common in the early days of endoscopy, insertion by the assisting nurse (two-person technique) is now rare. Using the Norwegian national endoscopy quality assurance (QA) programme, Gastronet, we wanted to explore the extent of two-person technique practice and look into possible differences in performance and QA output measures. Methods 100 colonoscopists in 18 colonoscopy centres having reported their colonoscopies to Gastronet between January and December 2009 were asked if they practiced one- or two-person technique during insertion of the colonoscope. They were categorized accordingly for comparative analyses of QA indicators. Results 75 endoscopists responded to the survey (representing 9368 colonoscopies): 62 of them (83%) applied the one-person technique and 13 (17%) the two-person technique. Patients' age and sex distributions and indications for colonoscopy were similar in the two groups. Caecal intubation was 96% in the two-person group compared to 92% in the one-person group (p < 0.001). Pain reports were similar in the groups, but time to the caecum was shorter and the use of sedation less in the two-person group. Conclusion The two-person technique for colonoscope insertion was practiced by a considerable minority of endoscopists (17%). QA indicators were either similar to or better than those for the one-person technique. This suggests that there may be some beneficial elements to this technique worth exploring and importing into the much preferred one-person insertion technique. PMID:21672243
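The reported caecal intubation difference (96% vs. 92%, p < 0.001) is the kind of comparison a two-proportion z-test captures. A sketch under assumed group sizes; the abstract does not state which statistical test was actually used:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two proportions,
    using the pooled standard error; returns (z, p_value)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))       # two-sided normal tail
    return z, p_value
```

With the study's actual per-group colonoscopy counts (in the thousands), even a 4-percentage-point gap yields a very small p-value, consistent with the reported p < 0.001.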

  10. TESTING GROUND BASED GEOPHYSICAL TECHNIQUES TO REFINE ELECTROMAGNETIC SURVEYS NORTH OF THE 300 AREA HANFORD WASHINGTON

    SciTech Connect

    PETERSEN SW

    2010-12-02

    Airborne electromagnetic (AEM) surveys were flown during fiscal year (FY) 2008 within the 600 Area in an attempt to characterize the underlying subsurface and to aid in the closure and remediation design study goals for the 200-PO-1 Groundwater Operable Unit (OU). The rationale for using the AEM surveys was that airborne surveys can cover large areas rapidly at relatively low costs with minimal cultural impact, and observed geo-electrical anomalies could be correlated with important subsurface geologic and hydrogeologic features. Initial interpretation of the AEM surveys indicated a tenuous correlation with the underlying geology, from which several anomalous zones likely associated with channels/erosional features incised into the Ringold units were identified near the River Corridor. Preliminary modeling resulted in a slightly improved correlation but revealed that more information was required to constrain the modeling (SGW-39674, Airborne Electromagnetic Survey Report, 200-PO-1 Groundwater Operable Unit, 600 Area, Hanford Site). Both time- and frequency-domain AEM surveys were collected, with the densest coverage occurring adjacent to the Columbia River Corridor. Time-domain surveys targeted deeper subsurface features (e.g., top-of-basalt) and were acquired using the HeliGEOTEM® system along north-south flight lines with a nominal 400 m (1,312 ft) spacing. The frequency-domain RESOLVE system acquired electromagnetic (EM) data along tighter spaced (100 m [328 ft] and 200 m [656 ft]) north-south profiles in the eastern fifth of the 200-PO-1 Groundwater OU (immediately adjacent to the River Corridor). The overall goal of this study is to provide further quantification of the AEM survey results, using ground-based geophysical methods, and to link results to the underlying geology and/or hydrogeology.
Specific goals of this project are as follows: (1) Test ground-based geophysical techniques for their efficacy in delineating underlying geology; (2) Use ground …

  11. [Abortion in Brazil: a household survey using the ballot box technique].

    PubMed

    Diniz, Debora; Medeiros, Marcelo

    2010-06-01

    This study presents the first results of the National Abortion Survey (PNA, Pesquisa Nacional de Aborto), a household random sample survey fielded in 2010 covering urban women in Brazil aged 18 to 39 years. The PNA combined two techniques, interviewer-administered questionnaires and self-administered ballot box questionnaires. The results of the PNA show that by the end of her reproductive life one in five women has had an abortion, with abortions being more frequent at the main reproductive ages, that is, from 18 to 29 years old. No relevant differentiation was observed in the practice of abortion among religious groups, but abortion was found to be more common among women with lower education. Medical drugs were used to induce abortion in half of the abortions, and post-abortion hospitalization was observed among approximately half of the women who aborted. These results lead to the conclusion that abortion is a priority in the Brazilian public health agenda. PMID:20640252

  12. A Survey and Analysis of Techniques Used in Attracting the Black Middle-Class Patient

    PubMed Central

    Barnwell, Sydney; LaMendola, Walter F.

    1985-01-01

    This study presents a survey based upon black physicians' perception of the expectations of the black middle-class patient. This perception is that middle-class expectations are low; hence, satisfaction is low, and the result is that prospective patients tend to utilize the services of white physicians. The survey was designed to sample the opinions of physicians attending the 1983 annual meeting of the National Medical Association in Chicago, and it determined the most useful techniques in attracting black middle-class patients. These investigators believe that there is an immediate need for a market-concept approach utilizing the results of this study to help the black doctor market his services more effectively. Such a market-concept approach is presented. PMID:3999152

  13. Chest physiotherapy techniques in neurological intensive care units of India: A survey

    PubMed Central

    Bhat, Anup; Chakravarthy, Kalyana; Rao, Bhamini K.

    2014-01-01

    Context: Neurological intensive care units (ICUs) are a rapidly developing sub-specialty of neurosciences. Chest physiotherapy techniques are of great value in neurological ICUs in preventing, halting, or reversing the impairments caused by neurological disorders and ICU stay. However, chest physiotherapy techniques must be modified to a greater extent in the neurological ICU as compared with general ICUs. Aim: The aim of this study is to obtain data on current chest physiotherapy practices in neurological ICUs of India. Settings and Design: A tertiary care hospital in Karnataka, India, and cross-sectional survey. Subjects and Methods: A questionnaire was formulated and content validated to assess the current chest physiotherapy practices in neurological ICUs of India. The questionnaire was constructed online and a link was distributed via e-mail to 185 physiotherapists working in neurological ICUs across India. Statistical Analysis Used: Descriptive statistics. Results: The response rate was 44.3% (n = 82); 31% of the physiotherapists were specialized in cardiorespiratory physiotherapy and 30% were specialized in neurological physiotherapy. Clapping, vibration, postural drainage, aerosol therapy, humidification, and suctioning were the airway clearance (AC) techniques commonly used by the majority of physiotherapists. However, devices for AC such as Flutter, Acapella, and standard positive expiratory pressure devices were used less frequently. Techniques such as autogenic drainage and the active cycle of breathing technique are also frequently used when appropriate for the patients. Lung expansion therapy techniques such as breathing exercises, incentive spirometry exercises, positioning, and proprioceptive neuromuscular facilitation of breathing are used by the majority of physiotherapists.
Conclusions: Physiotherapists in this study were using conventional chest physiotherapy techniques more frequently in comparison to the devices available for …

  14. Current ablation techniques for persistent atrial fibrillation: results of the European Heart Rhythm Association Survey.

    PubMed

    Dagres, Nikolaos; Bongiorni, Maria Grazia; Larsen, Torben Bjerregaard; Hernandez-Madrid, Antonio; Pison, Laurent; Blomström-Lundqvist, Carina

    2015-10-01

    The aim of this survey was to provide insight into current practice regarding ablation of persistent atrial fibrillation (AF) among members of the European Heart Rhythm Association electrophysiology research network. Thirty centres responded to the survey. The main ablation technique for first-time ablation was stand-alone pulmonary vein isolation (PVI): in 67% of the centres for persistent but not long-standing AF and in 37% of the centres for long-standing persistent AF as well. Other applied techniques were ablation of fractionated electrograms, placement of linear lesions, a stepwise approach until AF termination, and substrate mapping and isolation of low-voltage areas. However, the percentage of centres applying these techniques during first ablation did not exceed 25% for any technique. When stand-alone PVI was performed in patients with persistent but not long-standing AF, the majority (80%) of the centres used an irrigated radiofrequency ablation catheter whereas 20% of the respondents used the cryoballoon. Similar results were reported for ablation of long-standing persistent AF (radiofrequency 90%, cryoballoon 10%). Neither rotor mapping nor one-shot ablation tools were used as the main first-time ablation methods. Systematic search for non-pulmonary vein triggers was performed in only 10% of the centres. The most common 1-year success rate off antiarrhythmic drugs was 50-60%. Only 27% of the centres knew their 5-year results. In conclusion, patients with persistent AF represent a significant proportion of AF patients undergoing ablation. There is a shift towards stand-alone PVI being the primary choice in many centres for first-time ablation in these patients. The wide variation in the use of additional techniques and in the choice of endpoints reflects the uncertainties and lack of guidance regarding the optimal approach. Procedural success rates are modest and long-term outcomes are unknown in most centres. PMID:26498718

  15. Nurse Practitioners' Use of Communication Techniques: Results of a Maryland Oral Health Literacy Survey

    PubMed Central

    Koo, Laura W.; Horowitz, Alice M.; Radice, Sarah D.; Wang, Min Q.; Kleinman, Dushanka V.

    2016-01-01

    Objectives We examined nurse practitioners’ use and opinions of recommended communication techniques for the promotion of oral health as part of a Maryland state-wide oral health literacy assessment. Use of recommended health-literate and patient-centered communication techniques have demonstrated improved health outcomes. Methods A 27-item self-report survey, containing 17 communication technique items, across 5 domains, was mailed to 1,410 licensed nurse practitioners (NPs) in Maryland in 2010. Use of communication techniques and opinions about their effectiveness were analyzed using descriptive statistics. General linear models explored provider and practice characteristics to predict differences in the total number and the mean number of communication techniques routinely used in a week. Results More than 80% of NPs (N = 194) routinely used 3 of the 7 basic communication techniques: simple language, limiting teaching to 2–3 concepts, and speaking slowly. More than 75% of respondents believed that 6 of the 7 basic communication techniques are effective. Sociodemographic provider characteristics and practice characteristics were not significant predictors of the mean number or the total number of communication techniques routinely used by NPs in a week. Potential predictors for using more of the 7 basic communication techniques, demonstrating significance in one general linear model each, were: assessing the office for user-friendliness and ever taking a communication course in addition to nursing school. Conclusions NPs in Maryland self-reported routinely using some recommended health-literate communication techniques, with belief in their effectiveness. Our findings suggest that NPs who had assessed the office for patient-friendliness or who had taken a communication course beyond their initial education may be predictors for using more of the 7 basic communication techniques. These self-reported findings should be validated with observational studies

  16. Survey of radiographic requirements and techniques in United States dental assisting programs, 1982

    SciTech Connect

    Farman, A.G.; Grammer, S.; Hunter, N.; Baker, C.

    1983-10-01

    A survey of dental assisting programs revealed little standardization of student requirements for dental radiography in the United States. Areas for concern were: the high proportion of programs in which classmates exposed one another to ionizing radiation for training purposes; and the continued use of closed cones in some cases. Preclinical laboratories in radiography were, on average, of considerably longer duration than those previously reported for dental students. Conversely, clinical requirements in intraoral techniques were less for dental assisting students than is the case for dental students. Available methods of reducing patient exposure to ionizing radiation are not being fully implemented.

  17. Vector Quantization of Harmonic Magnitudes in Speech Coding Applications—A Survey and New Technique

    NASA Astrophysics Data System (ADS)

    Chu, Wai C.

    2004-12-01

    A harmonic coder extracts the harmonic components of a signal and represents them efficiently using a few parameters. The principles of harmonic coding have become quite successful and several standardized speech and audio coders are based on it. One of the key issues in harmonic coder design is in the quantization of harmonic magnitudes, where many propositions have appeared in the literature. The objective of this paper is to provide a survey of the various techniques that have appeared in the literature for vector quantization of harmonic magnitudes, with emphasis on those adopted by the major speech coding standards; these include constant magnitude approximation, partial quantization, dimension conversion, and variable-dimension vector quantization (VDVQ). In addition, a refined VDVQ technique is proposed where experimental data are provided to demonstrate its effectiveness.
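The core operation shared by all the quantization schemes surveyed, nearest-codeword search, can be sketched as follows. This is an illustrative fragment only; codebook design (e.g., via the k-means/LBG algorithm) and the variable-dimension handling that distinguishes VDVQ are separate steps:

```python
import numpy as np

def vq_encode(vectors, codebook):
    """For each input vector, return the index of the nearest codeword
    under squared Euclidean distance.  vectors: (n, k) array;
    codebook: (m, k) array of codewords."""
    # Broadcast to an (n, m) matrix of squared distances, then pick
    # the closest codeword per input vector.
    d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)
```

Decoding is then just a table lookup, `codebook[indices]`, which is what makes VQ attractive at the low bit rates harmonic coders target.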

  18. Use of remote-sensing techniques to survey the physical habitat of large rivers

    USGS Publications Warehouse

    Edsall, Thomas A.; Behrendt, Thomas E.; Cholwek, Gary; Frey, Jeffery W.; Kennedy, Gregory W.; Smith, Stephen B.

    1997-01-01

    This report describes remote-sensing techniques that can be used to quantitatively characterize the physical habitat in large rivers in the United States, where traditional survey approaches typically used in small- and medium-sized streams and rivers would be ineffective or impossible to apply. The state-of-the-art remote-sensing technologies that we discuss here include side-scan sonar, RoxAnn, acoustic Doppler current profiler, remotely operated vehicles and camera systems, global positioning systems, and laser level survey systems. The use of these technologies will permit the collection of information needed to create computer visualizations and hard-copy maps and generate quantitative databases that can be used in real-time mode in the field to characterize the physical habitat at a study location of interest and to guide the distribution of sampling effort needed to address other habitat-related study objectives. This report augments habitat sampling and characterization guidance provided by Meador et al. (1993) and is intended for use primarily by U.S. Geological Survey National Water Quality Assessment program managers and scientists who are documenting water quality in streams and rivers of the United States.

  19. Use of image guided radiation therapy techniques and imaging dose measurement at Indian hospitals: A survey

    PubMed Central

    Deshpande, Sudesh; Dhote, D. S.; Kumar, Rajesh; Naidu, Suresh; Sutar, A.; Kannan, V.

    2015-01-01

    A national survey was conducted to obtain information about the use of image-guided radiotherapy (IGRT) techniques and IGRT dose measurement methods being followed at Indian radiotherapy centers. A questionnaire containing parameters relevant to the use of IGRT was prepared to collect information pertaining to (i) availability and type of IGRT delivery system, (ii) frequency of image acquisition protocol and utilization of these images for different purposes, and (iii) imaging dose measurement. The questionnaire was circulated to 75 hospitals in the country having an IGRT facility, and responses from 51 centers were received. Survey results showed that among surveyed hospitals, 86% of centers have an IGRT facility and 78% of centers have kilovoltage three-dimensional volumetric imaging. 75% of hospitals in our study do not perform computed tomography dose index measurements and 89% of centers do not perform patient dose measurements. Moreover, only 29% of physicists believe IGRT dose is an additional radiation burden to the patient. This study has brought into focus the need to design a national protocol for IGRT dose measurement and the development of indigenous tools to perform IGRT dose measurements. PMID:26865758

  1. Survey of agents and techniques applicable to the solidification of low-level radioactive wastes

    SciTech Connect

    Fuhrmann, M.; Neilson, R.M. Jr.; Colombo, P.

    1981-12-01

    A review of the various solidification agents and techniques that are currently available or potentially applicable for the solidification of low-level radioactive wastes is presented. An overview of the types and quantities of low-level wastes produced is given first. Descriptions of waste form matrix materials, the waste types for which they have been or may be applied, and available information concerning relevant waste form properties and characteristics follow. Also included are descriptions of the processing techniques themselves, with an emphasis on those operating parameters that impact waste form properties. The solidification agents considered in this survey include hydraulic cements, thermoplastic materials, thermosetting polymers, glasses, synthetic minerals, and composite materials. This survey is part of a program supported by the United States Department of Energy's Low-Level Waste Management Program (LLWMP). This work provides input into LLWMP efforts to develop and compile information relevant to the treatment and processing of low-level wastes and their disposal by shallow land burial.

  2. Telephone survey to investigate relationships between onychectomy or onychectomy technique and house soiling in cats.

    PubMed

    Gerard, Amanda F; Larson, Mandy; Baldwin, Claudia J; Petersen, Christine

    2016-09-15

    OBJECTIVE To determine whether associations existed between onychectomy or onychectomy technique and house soiling in cats. DESIGN Cross-sectional study. SAMPLE 281 owners of 455 cats in Polk County, Iowa, identified via a list of randomly selected residential phone numbers of cat owners in that region. PROCEDURES A telephone survey was conducted to collect information from cat owners on factors hypothesized a priori to be associated with house soiling, including cat sex, reproductive status, medical history, and onychectomy history. When cats that had undergone onychectomy were identified, data were collected regarding the cat's age at the time of the procedure and whether a carbon dioxide laser (CDL) had been used. Information on history of house soiling behavior (urinating or defecating outside the litter box) was also collected. RESULTS Onychectomy technique was identified as a risk factor for house soiling. Cats for which a non-CDL technique was used had a higher risk of house soiling than cats for which the CDL technique was used. Cats that had undergone onychectomy and that lived in a multicat (3 to 5 cats) household were more than 3 times as likely to have house soiled as were single-housed cats with intact claws. CONCLUSIONS AND CLINICAL RELEVANCE Results of this cross-sectional study suggested that use of the CDL technique for onychectomy could decrease the risk of house soiling by cats relative to the risk associated with other techniques. This and other findings can be used to inform the decisions of owners and veterinarians when considering elective onychectomy for cats. PMID:27585101

  3. Epidemiological survey of different clinical techniques of orthodontic bracket debonding and enamel polishing

    PubMed Central

    Sfondrini, Maria Francesca; Scribante, Andrea; Fraticelli, Danilo; Roncallo, Silvia; Gandini, Paola

    2015-01-01

    Objectives: To conduct an epidemiological survey of orthodontic debonding techniques in Italy and describe the methods most commonly used to remove brackets and adhesive from tooth surfaces. Materials and Methods: A survey consisting of 6 questions about bracket debonding methods and instruments used was emailed to 1000 orthodontists who were members of the Italian Orthodontics Society (SIDO). Clinicians differed in sex, age, origin, and professional experience. Results: Overall, 267 surveys were returned, representing a response rate of 26.7% of the participants interviewed. 0.2% of the orthodontists responded via email confirming that they were not interested, while 3% of the questionnaires were sent back uncompleted. 70.1% of the clinicians interviewed did not return any response. Overall, 64% of SIDO members (orthodontists) did not detect any enamel damage after debonding. The brackets used most frequently (89.14%) in clinical practice were metal ones. The most commonly used pliers for bracket removal were cutters (37.08%) and bracket removal pliers (34.83%). For adhesive removal, low-speed tungsten carbide burs under irrigation were the most widely utilized method (40.08%), followed by high-speed carbide burs (14.19%) and diamond burs (14.19%). The instruments most frequently used for polishing after debonding were rubber cups (36.70%) and abrasive discs (21.35%). 31.21% of the orthodontists found esthetic enamel changes when comparing enamel before bonding and after debonding. Conclusions: This survey showed high variability among methods for bracket debonding, adhesive removal, and tooth polishing. The collected answers indicate that most orthodontists have developed their own debonding and polishing armamentarium based on trial and error. PMID:26952141

  4. A survey on acoustic signature recognition and classification techniques for persistent surveillance systems

    NASA Astrophysics Data System (ADS)

    Shirkhodaie, Amir; Alkilani, Amjad

    2012-06-01

    Application of acoustic sensors in Persistent Surveillance Systems (PSS) has received considerable attention over the last two decades because they can be rapidly deployed at low cost. Conventional uses of acoustic sensors in PSS span a wide range of applications, including vehicle classification, target tracking, activity understanding, speech recognition, and shooter detection. This paper presents a current survey of physics-based acoustic signature classification techniques for outdoor sound recognition and understanding. In particular, it focuses on a taxonomy and ontology of acoustic signatures resulting from group activities. The taxonomy and supporting ontology cover human-vehicle, human-object, and human-human interactions. The paper also examines the applicability of several spectral analysis techniques as a means of maximizing the likelihood of correct acoustic source detection, recognition, and discrimination. Spectral analysis techniques based on the Fast Fourier Transform, Discrete Wavelet Transform, and Short Time Fourier Transform are considered for extracting features from acoustic sources. In addition, comprehensive overviews of current research activities related to the scope of this work are presented along with their applications. Finally, potential future directions of research in this area are discussed for improving acoustic signature recognition and classification technology suitable for PSS applications.
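
    The STFT-based feature extraction considered in the paper can be sketched as follows; the frame length, hop size, Hann window, and the two example features (spectral centroid and frame energy) are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def stft_features(signal, rate, frame_len=1024, hop=512):
    """Magnitude-spectrogram features from a Short Time Fourier Transform."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    spectra = np.abs(np.fft.rfft(frames, axis=1))  # one spectrum per frame
    freqs = np.fft.rfftfreq(frame_len, d=1.0 / rate)
    # Two simple per-frame features: spectral centroid (Hz) and energy.
    centroid = (spectra * freqs).sum(axis=1) / (spectra.sum(axis=1) + 1e-12)
    energy = (spectra ** 2).sum(axis=1)
    return np.column_stack([centroid, energy])

# Demo: a 440 Hz tone sampled at 8 kHz; the centroid tracks the tone.
rate = 8000
t = np.arange(rate) / rate
feats = stft_features(np.sin(2 * np.pi * 440 * t), rate)
```

    A classifier would consume such per-frame feature vectors; DWT-based features would replace the FFT stage with a wavelet filter-bank decomposition.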

  5. HRMS Sky Survey Techniques for Separating the Rare Interesting Signal from the Multitude of Background Signals

    NASA Technical Reports Server (NTRS)

    Olsen, E.; Backus, C.; Gulkis, S.; Levin, S.

    1993-01-01

    The NASA High Resolution Microwave Survey (HRMS) Sky Survey component will survey the entire celestial sphere over the microwave frequency band to search for signals of intelligent origin which originate from beyond our solar system.

  6. Knowledge based systems: A preliminary survey of selected issues and techniques

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1984-01-01

    Only recently has research in Artificial Intelligence (AI) begun to accomplish practical results. Most of these results can be attributed to the design and use of expert systems (or Knowledge-Based Systems, KBS) - problem-solving computer programs that can reach a level of performance comparable to that of a human expert in some specialized problem domain. By contrast, many computer systems designed to see images, hear sounds, and recognize speech are still at a fairly early stage of development. This report presents a preliminary survey of recent work on KBS, explaining KBS concepts and the issues and techniques used to construct such systems. Application considerations for constructing KBS and potential KBS research areas are identified. A case study (MYCIN) of a KBS is also provided.
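
    A minimal forward-chaining sketch illustrates the rule-based problem solving such systems perform; the rules and facts below are invented for illustration, and MYCIN itself used backward chaining with certainty factors, which this toy omits.

```python
def forward_chain(facts, rules):
    """Fire (premises -> conclusion) rules until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and premises <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Toy knowledge base loosely styled after a diagnostic domain.
rules = [
    ({"gram_negative", "rod_shaped"}, "enterobacteriaceae"),
    ({"enterobacteriaceae", "lactose_fermenter"}, "e_coli_suspected"),
]
derived = forward_chain({"gram_negative", "rod_shaped", "lactose_fermenter"}, rules)
# derived now also contains "enterobacteriaceae" and "e_coli_suspected"
```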

  7. A Survey on Terrain Assessment Techniques for Autonomous Operation of Planetary Robots

    NASA Astrophysics Data System (ADS)

    Sancho-Pradel, D. L.; Gao, Y.

    A key challenge in autonomous planetary surface exploration is the extraction of meaningful information from sensor data, which would allow a good interpretation of the nearby terrain, and a reasonable assessment of more distant areas. In the last decade, the desire to increase the autonomy of unmanned ground vehicles (UGVs), particularly in terms of off-road navigation, has significantly increased the interest in the field of automated terrain classification. Although the field is relatively new, its advances and goals are scattered across different robotic platforms and applications. The objective of this paper is to present a survey of the field from a planetary exploration perspective, bringing together the underlying techniques, existing approaches and relevant applications under a common framework. The aim is to provide a comprehensive overview to the newcomer in the field, and a structured reference for the practitioners.

  8. Surveying co-located space geodesy techniques for ITRF computation: statistical aspects

    NASA Astrophysics Data System (ADS)

    Sillard, P.; Sarti, P.; Vittuari, L.

    2003-04-01

    For two years, CNR (Italy) has been involved in a complete renovation of the way co-located Space Geodesy instruments are surveyed. Local ties are now widely agreed to be one of the most problematic parts of International Terrestrial Reference Frame (ITRF) computation, since the uncertainty of Space Geodesy techniques has decreased to the level of a few millimeters. CNR has therefore decided to start a comprehensive reflection on the way local ties should be surveyed between Space Geodesy instruments. This reflection concerns the practical ground operations, the physical definition of a Space Geodesy instrument reference point (especially for VLBI), and the consequent adjustment of the results, as well as their publication. The first two aspects are covered in another presentation; the present one focuses on the last two points (statistics and publication). As Space Geodesy has now reached the mm level, local ties must be used in ITRF computation with a full variance-covariance matrix available for one site. The talk will present the way this variance can be derived, even when the reference point is implicitly defined, as for VLBI. Numerical examples will be given of the quality that can be reached through a rigorous statistical treatment of the new approach developed by CNR. Evidence of the significant improvement this brings to ITRF-type computations will also be given.

  9. Worldwide Enucleation Techniques and Materials for Treatment of Retinoblastoma: An International Survey

    PubMed Central

    Mourits, Daphne L.; Hartong, Dyonne T.; Bosscha, Machteld I.; Kloos, Roel J. H. M.; Moll, Annette C.

    2015-01-01

    Purpose To investigate the current practice of enucleation with or without orbital implant for retinoblastoma in countries across the world. Methods A digital survey identifying operation techniques and material used for orbital implants after enucleation in patients with retinoblastoma. Results We received a response of 58 surgeons in 32 different countries. A primary artificial implant is routinely inserted by 42 (72.4%) surgeons. Ten (17.2%) surgeons leave the socket empty, three (5.2%) decide per case. Other surgeons insert a dermis fat graft as a standard primary implant (n=1), or fill the socket in a standard secondary procedure (n=2; one uses dermis fat grafts and one artificial implants). The choice for porous implants was more frequent than for non-porous implants: 27 (58.7%) and 15 (32.6%), respectively. Both porous and non-porous implant types are used by 4 (8.7%) surgeons. Twenty-five surgeons (54.3%) insert bare implants, 11 (23.9%) use separate wrappings, eight (17.4%) use implants with prefab wrapping and two insert implants with and without wrapping depending on type of implant. Attachment of the muscles to the wrapping or implant (at various locations) is done by 31 (53.4%) surgeons. Eleven (19.0%) use a myoconjunctival technique, nine (15.5%) suture the muscles to each other and seven (12.1%) do not reattach the muscles. Measures to improve volume are implant exchange at an older age (n=4), the use of Restylane SQ (n=1) and osmotic expanders (n=1). Pegging is done by two surgeons. Conclusion No (worldwide) consensus exists about the use of material and techniques for enucleation for the treatment of retinoblastoma. Considerations for the use of different techniques are discussed. PMID:25767872

  10. Search Techniques for the Web of Things: A Taxonomy and Survey.

    PubMed

    Zhou, Yuchao; De, Suparna; Wang, Wei; Moessner, Klaus

    2016-01-01

    The Web of Things aims to make physical world objects and their data accessible through standard Web technologies to enable intelligent applications and sophisticated data analytics. Due to the amount and heterogeneity of the data, it is challenging to perform data analysis directly; especially when the data is captured from a large number of distributed sources. However, the size and scope of the data can be reduced and narrowed down with search techniques, so that only the most relevant and useful data items are selected according to the application requirements. Search is fundamental to the Web of Things while challenging by nature in this context, e.g., mobility of the objects, opportunistic presence and sensing, continuous data streams with changing spatial and temporal properties, efficient indexing for historical and real time data. The research community has developed numerous techniques and methods to tackle these problems as reported by a large body of literature in the last few years. A comprehensive investigation of the current and past studies is necessary to gain a clear view of the research landscape and to identify promising future directions. This survey reviews the state-of-the-art search methods for the Web of Things, which are classified according to three different viewpoints: basic principles, data/knowledge representation, and contents being searched. Experiences and lessons learned from the existing work and some EU research projects related to Web of Things are discussed, and an outlook to the future research is presented. PMID:27128918

  12. Finding Hidden Geothermal Resources in the Basin and Range Using Electrical Survey Techniques: A Computational Feasibility Study

    SciTech Connect

    Pritchett, J. W.

    2004-12-01

    For many years, there has been speculation about "hidden" or "blind" geothermal systems—reservoirs that lack an obvious overlying surface fluid outlet. At present, it is simply not known whether "hidden" geothermal reservoirs are rare or common. An approach to identifying promising drilling targets using methods that are cheaper than drilling is needed. These methods should be regarded as reconnaissance tools, whose primary purpose is to locate high-probability targets for subsequent deep confirmation drilling. The purpose of this study was to appraise the feasibility of finding "hidden" geothermal reservoirs in the Basin and Range using electrical survey techniques, and of adequately locating promising targets for deep exploratory drilling based on the survey results. The approach was purely theoretical. A geothermal reservoir simulator was used to carry out a lengthy calculation of the evolution of a synthetic but generic Great Basin-type geothermal reservoir to a quasi-steady "natural state". Postprocessors were used to estimate what a suite of geophysical surveys of the prospect would see. Based on these results, the different survey techniques were compared and evaluated in terms of their ability to identify suitable drilling targets. This process was completed for eight different "reservoir models". Of the eight cases considered, four were "hidden" systems, so that the survey techniques could be appraised in terms of their ability to detect and characterize such resources and to distinguish them from more conventionally situated geothermal reservoirs. It is concluded that the best way to find "hidden" Basin and Range geothermal resources of this general type is to carry out simultaneous SP and low-frequency MT surveys, and then to combine the results of both surveys with other pertinent information using mathematical "inversion" techniques to characterize the subsurface quantitatively. Many such surveys and accompanying analyses can be carried out

  13. Application of Phase Shift Projection Moire Technique in Solid Surfaces Topographic Survey

    NASA Astrophysics Data System (ADS)

    Lino, A. C. L.; Dal Fabbro, I. M.; Enes, A. M.

    2008-04-01

    The application of projection moiré with phase-shift techniques to surveying the surface topography of vegetable organs required several basic procedural steps before significant conclusions could be reached. As recommended by [1], the proposed method should be tested on virtual surfaces [1] before being carried out on solid symmetric surfaces [2], followed by tests on asymmetric surfaces such as fruits [3], and finally by the generation of 3D digital models of solid figures as well as of fruits [4]. In this research, identified as step [2], the tested objects included cylinders, cubes and spheres. A Ronchi grid named G1 was generated in a PC, from which other grids, referred to as G2, G3, and G4, were set out of phase by 1/4, 1/2 and 3/4 of a period from G1. Grid G1 was then projected onto the sample surface instead of being virtually distorted, receiving the name Gd. The difference between Gd and each of G1, G2, G3, and G4, followed by filtration, generated the moiré fringes M1, M2, M3 and M4, respectively. The fringes are out of phase with one another by 1/4 of a period, and were processed by the Rising Sun Moiré software to produce the wrapped phase and, further on, the unwrapped fringes. Final representations in gray levels as well as in contour lines showed the topography of the deformed grid Gd. Parallel line segments were projected onto the moiré-generated surface images to evaluate the approximation to the real surface. The line segment images were then captured by means of the ImageJ software and the corresponding curve fits obtained. The work concluded that the proposed method reliably surveys the shape of solid figures.
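
    The four quarter-period-shifted fringe patterns admit the standard four-step phase recovery, which can be sketched as follows; the synthetic tilted-plane surface and the modulation values are illustrative, not data from the experiment.

```python
import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    """Four-step phase shifting: patterns shifted by 0, 1/4, 1/2, 3/4 period."""
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic check on a tilted-plane phase map (phase grows left to right).
x = np.linspace(0, 4 * np.pi, 256)
phi = np.tile(x, (64, 1))                              # "true" surface phase
shots = [1.0 + 0.5 * np.cos(phi + k * np.pi / 2) for k in range(4)]
phi_wrapped = wrapped_phase(*shots)                    # wrapped to (-pi, pi]
```

    Unwrapping this map then yields the continuous topography.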

  14. Recent mycotoxin survey data and advanced mycotoxin detection techniques reported from China: a review.

    PubMed

    Selvaraj, Jonathan Nimal; Wang, Yan; Zhou, Lu; Zhao, Yueju; Xing, Fuguo; Dai, Xiaofeng; Liu, Yang

    2015-01-01

    Mycotoxin contamination in agro-food systems has been a serious concern over the last few decades in China, where the Ministry of Health has set maximum limits for mycotoxins in different agro-products. Overall survey data show that aflatoxin contamination in infant cereals, edible oils, raw milk, ginger and its related products are far below Chinese regulatory limits. The absence of aflatoxin M1 contamination in infant milk powders indicates a high standard of control. Aflatoxins in liquorice roots and lotus seeds have been reported for the first time. For deoxynivalenol, high levels were found in wheat grown in the Yangtze Delta region, which is more prone to rainfall, supporting Fusarium infection. The emerging mycotoxins beauvericins and enniatins have been reported in the medicinal herbs in China. Ochratoxin A in wine was below the European Union regulatory limits, but fumonisins in maize need to be monitored and future regulatory control considered. Overall from all the survey data analysed in this review, it can be concluded that 92% of the samples analysed had mycotoxin levels below the Chinese regulatory limits. In terms of detection techniques in recent years, immuno-based assays have been developed largely due to their excellent sensitivity and ease of use. Assays targeting multiple mycotoxins like aflatoxins, ochratoxin A, zearalenone and deoxynivalenol have been reported using microarrays and suspension arrays targeting in particular maize, rice and peanuts. Aptamer-based assays against ochratoxin A and aflatoxins B1 and B2 have been developed involving fluorescence detection; and surface plasmon resonance immunosensors have been developed targeting wine, maize, wheat, wild rye, hay and peanut oil with high sensitivity (> 0.025 ng l(-1)). Commercialisation of these technologies is much needed for wider usage in the coming years. PMID:25604871

  15. Human-computer dialogue: Interaction tasks and techniques. Survey and categorization

    NASA Technical Reports Server (NTRS)

    Foley, J. D.

    1983-01-01

    Interaction techniques for human-computer dialogue are described. Six basic interaction tasks, the requirements for each task, requirements related to interaction techniques, and the hardware prerequisites of a technique affecting device selection are discussed.

  16. Exploring Halo Substructure with Giant Stars. I. Survey Description and Calibration of the Photometric Search Technique

    NASA Astrophysics Data System (ADS)

    Majewski, Steven R.; Ostheimer, James C.; Kunkel, William E.; Patterson, Richard J.

    2000-11-01

    We have begun a survey of the structure of the Milky Way halo, as well as the halos of other Local Group galaxies, as traced by their constituent giant stars. These giant stars are identified via large-area, CCD photometric campaigns. Here we present the basis for our photometric search method, which relies on the gravity sensitivity of the Mg I triplet+MgH features near 5150 Å in F-K stars, and which is sensed by the flux in the intermediate-band DDO51 filter. Our technique is a simplified variant of the combined Washington/DDO51 four-filter technique described by Geisler, which we modify for the specific purpose of efficiently identifying distant giant stars for follow-up spectroscopic study: We show here that for most stars the Washington T1-T2 color is correlated monotonically with the Washington M-T2 color with relatively low scatter; for the purposes of our survey, this correlation obviates the need to image in the T1 filter, as originally proposed by Geisler. To calibrate our (M-T2, M-DDO51) diagram as a means to discriminate field giant stars from nearby dwarfs, we utilize new photometry of the main sequences of the open clusters NGC 3680 and NGC 2477 and the red giant branches of the clusters NGC 3680, Melotte 66, and ω Centauri, supplemented with data on field stars, globular clusters and open clusters by Doug Geisler and collaborators. By combining the data on stars from different clusters, and by taking advantage of the wide abundance spread within ω Centauri, we verify the primary dependence of the M-DDO51 color on luminosity and demonstrate the secondary sensitivity to metallicity among giant stars. Our empirical results are found to be generally consistent with those from analysis of synthetic spectra by Paltoglou & Bell. Finally, we provide conversion formulae from the (M, M-T2) system to the (V, V-I) system, corresponding reddening laws, as well as empirical red giant branch curves from ω Centauri stars for use in deriving photometric
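
    A hedged sketch of the kind of two-color giant/dwarf separation described above: stars are kept if they lie above an assumed dwarf locus in the (M-T2, M-DDO51) plane, since giants, with weaker gravity-sensitive Mg absorption, sit at larger M-DDO51. The linear locus and the sample colors below are invented for illustration; they are not the calibration derived in the paper.

```python
import numpy as np

def select_giant_candidates(m_t2, m_ddo51, dwarf_locus):
    """Keep stars above an assumed dwarf locus in the (M-T2, M-DDO51) plane."""
    return m_ddo51 > dwarf_locus(m_t2)

# Illustrative straight-line dwarf locus: M-DDO51 = 0.08*(M-T2) - 0.05.
dwarf_locus = np.poly1d([0.08, -0.05])
m_t2 = np.array([1.2, 1.5, 0.9])
m_ddo51 = np.array([0.20, 0.02, 0.15])
mask = select_giant_candidates(m_t2, m_ddo51, dwarf_locus)  # [True, False, True]
```

    Selected candidates would then go on to spectroscopic follow-up, as the survey describes.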

  17. NOS/NGS activities to support development of radio interferometric surveying techniques

    NASA Technical Reports Server (NTRS)

    Carter, W. E.; Dracup, J. F.; Hothem, L. D.; Robertson, D. S.; Strange, W. E.

    1980-01-01

    National Geodetic Survey activities towards the development of operational geodetic survey systems based on radio interferometry are reviewed. Information about the field procedures, data reduction and analysis, and the results obtained to date is presented.

  18. Preferred tools and techniques for implantation of cardiac electronic devices in Europe: results of the European Heart Rhythm Association survey.

    PubMed

    Bongiorni, Maria Grazia; Proclemer, Alessandro; Dobreanu, Dan; Marinskis, Germanas; Pison, Laurent; Blomstrom-Lundqvist, Carina

    2013-11-01

    The aim of this European Heart Rhythm Association (EHRA) survey was to assess clinical practice in relation to the tools and techniques used for cardiac implantable electronic devices procedures in the European countries. Responses to the questionnaire were received from 62 members of the EHRA research network. The survey involved high-, medium-, and low-volume implanting centres, performing, respectively, more than 200, 100-199 and under 100 implants per year. The following topics were explored: the side approach for implantation, surgical techniques for pocket incision, first venous access for lead implantation, preference of lead fixation, preferred coil number for implantable cardioverter-defibrillator (ICD) leads, right ventricular pacing site, generator placement site, subcutaneous ICD implantation, specific tools and techniques for cardiac resynchronization therapy (CRT), lead implantation sequence in CRT, coronary sinus cannulation technique, target site for left ventricular lead placement, strategy in left ventricular lead implant failure, mean CRT implantation time, optimization of the atrioventricular (AV) and ventriculo-ventricular intervals, CRT implants in patients with permanent atrial fibrillation, AV node ablation in patients with permanent AF. This panoramic view allows us to find out the operator preferences regarding the techniques and tools for device implantation in Europe. The results showed different practices in all the fields we investigated, nevertheless the survey also outlines a good adherence to the common standards and recommendations. PMID:24170423

  19. The San Pedro Mártir Open Cluster Survey: Progress, Techniques, Preliminary Results

    NASA Astrophysics Data System (ADS)

    Schuster, W.; Michel, R.; Dias, W.; Tapia-Peralta, T.; Vázquez, R.; Macfarland, J.; Chavarría, C.; Santos, C.; Moitinho, A.

    2007-05-01

    A CCD UBVRI survey of northern open clusters is being undertaken at San Pedro Mártir, Mexico, always performed using the same instrumental setup (telescope, CCD, filters), reduction methods, and system of standards (Landolt). To date more than 300 clusters (mostly unstudied previously) have been observed, and about half the data reduced using aperture-photometry and PSF techniques. Our analysis procedures are being refined by studying in detail a small subset of these clusters. For example, the heavily reddened clusters Be80 and Be95 are being examined in the color-color diagrams (B-V,U-B) and (B-V,R-I) to better understand the problems of curvature and variable reddening. For clusters for which our U data reach the F-type stars, such as NGC2192 and NGC7296, techniques are being examined for estimating both the reddening E(B-V) and metallicity [Fe/H] via the use of the (U-B) excess. If the clusters also have "red clump" stars, such as NGC1798 and Do02, these procedures can be iterated between the clump and main sequence stars to establish even better values of E(B-V) and [Fe/H]. Finally, color-magnitude diagrams, such as (B-V,V) and (V-I,V), are being employed together with the Schmidt-Kaler colors and Padova isochrones to obtain distances and ages for these clusters. A java-based computer program is being developed to help in the visualization and analysis of these photometric data. This system is capable of displaying each cluster simultaneously in different color-color and color-magnitude diagrams and provides an interactive way to identify a star, or group of stars, in one diagram and to see where it falls in the other diagrams, facilitating the elimination of field stars and the perception of cluster features. This program is capable of displaying up to 16 different diagrams for one cluster and processing up to 20 clusters at the same time. Our aims are the following: (1) a common UBVRI photometric scale for open clusters, (2) an atlas of color

  20. RESOLVE Survey Photometry and Volume-limited Calibration of the Photometric Gas Fractions Technique

    NASA Astrophysics Data System (ADS)

    Eckert, Kathleen D.; Kannappan, Sheila J.; Stark, David V.; Moffett, Amanda J.; Norris, Mark A.; Snyder, Elaine M.; Hoversten, Erik A.

    2015-09-01

    We present custom-processed ultraviolet, optical, and near-infrared photometry for the REsolved Spectroscopy of a Local VolumE (RESOLVE) survey, a volume-limited census of stellar, gas, and dynamical mass within two subvolumes of the nearby universe (RESOLVE-A and RESOLVE-B). RESOLVE is complete down to baryonic mass ~ 10^(9.1-9.3) M⊙, probing the upper end of the dwarf galaxy regime. In contrast to standard pipeline photometry (e.g., SDSS), our photometry uses optimal background subtraction, avoids suppressing color gradients, and employs multiple flux extrapolation routines to estimate systematic errors. With these improvements, we measure brighter magnitudes, larger radii, bluer colors, and a real increase in scatter around the red sequence. Combining stellar mass estimates based on our optimized photometry with the nearly complete H I mass census for RESOLVE-A, we create new z = 0 volume-limited calibrations of the photometric gas fractions (PGF) technique, which predicts gas-to-stellar mass ratios (G/S) from galaxy colors and optional additional parameters. We analyze G/S-color residuals versus potential third parameters, finding that axial ratio is the best independent and physically meaningful third parameter. We define a "modified color" from planar fits to G/S as a function of both color and axial ratio. In the complete galaxy population, upper limits on G/S bias linear and planar fits. We therefore model the entire PGF probability density field, enabling iterative statistical modeling of upper limits and prediction of full G/S probability distributions for individual galaxies. These distributions have two-component structure in the red color regime. Finally, we use the RESOLVE-B 21 cm census to test several PGF calibrations, finding that most systematically under- or overestimate gas masses, but the full probability density method performs well.
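
    The planar fit of G/S against color and axial ratio can be sketched with an ordinary least-squares plane; the mock data, coefficients, and choice of color below are illustrative assumptions, not the RESOLVE calibration.

```python
import numpy as np

# Mock sample: log10(G/S) generated from an assumed plane plus scatter.
rng = np.random.default_rng(0)
color = rng.uniform(0.1, 1.0, 200)            # a galaxy color (illustrative)
axial = rng.uniform(0.2, 1.0, 200)            # axial ratio b/a
log_gs = 1.5 - 2.0 * color + 0.5 * axial + rng.normal(0.0, 0.05, 200)

# Least-squares plane: log10(G/S) ~ c0 + c1*color + c2*axial.
A = np.column_stack([np.ones_like(color), color, axial])
coef, *_ = np.linalg.lstsq(A, log_gs, rcond=None)
modified_color = A @ coef                      # planar "modified color" predictor
```

    Handling the upper limits on G/S, as the paper stresses, requires going beyond such a plane fit to modeling the full probability density.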

  1. Relationships between autofocus methods for SAR and self-survey techniques for SONAR. [Synthetic Aperture Radar (SAR)

    SciTech Connect

    Wahl, D.E.; Jakowatz, C.V. Jr.; Ghiglia, D.C.; Eichel, P.H.

    1991-01-01

    Autofocus methods in SAR and self-survey techniques in SONAR have a common mathematical basis in that they both involve estimation and correction of phase errors introduced by sensor position uncertainties. Time delay estimation and correlation methods have been shown to be effective in solving the self-survey problem for towed SONAR arrays. Since it can be shown that platform motion errors introduce similar time-delay estimation problems in SAR imaging, the question arises as to whether such techniques could be effectively employed for autofocus of SAR imagery. With a simple mathematical model for motion errors in SAR, we will show why such correlation/time-delay techniques are not nearly as effective as established SAR autofocus algorithms such as phase gradient autofocus or sub-aperture based methods. This analysis forms an important bridge between signal processing methodologies for SAR and SONAR. 5 refs., 4 figs.
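
    The correlation/time-delay estimation discussed above can be sketched as follows; the synthetic two-channel signal is illustrative.

```python
import numpy as np

def estimate_delay(a, b):
    """Integer-sample lag of channel b relative to channel a,
    read off the peak of their cross-correlation."""
    corr = np.correlate(b, a, mode="full")
    return int(np.argmax(corr)) - (len(a) - 1)

# Channel b is channel a delayed by 7 samples (circular shift for simplicity).
rng = np.random.default_rng(1)
a = rng.normal(size=512)
b = np.roll(a, 7)
lag = estimate_delay(a, b)  # -> 7
```

    The paper's point is that platform-motion phase errors in SAR are not well corrected by this kind of estimator compared with phase gradient autofocus or sub-aperture methods.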

  2. Research Methodology in the Information Age: A Comparison of Two Survey Techniques.

    ERIC Educational Resources Information Center

    Ouimet, Judith A.; Hanson, Gary R.

    Historically, data have been collected from survey participants through a paper-and-pencil questionnaire or through interviews in person or on the telephone. This study compares the use of a new approach, interactive telephone data collection (ITDCT), to traditional paper-and-pencil collection. ITDCT administers survey items through a digitized…

  3. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 2: Production techniques

    PubMed Central

    Berry, J.; Nesbit, M.; Saberi, S.; Petridis, H.

    2014-01-01

    Aim The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. This second paper reports on the production techniques utilised. Materials and methods Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics, such as demographics, impression disinfection and suitability, and various production techniques. Settings were managed in order to ensure anonymity of respondents. Statistical analysis was undertaken to test the influence of various demographic variables such as the source of information, the location, and the size of the dental laboratory. Results The number of completed responses totalled 248 (32% response rate). Ninety percent of the respondents were based in England and the majority of dental laboratories were categorised as small sized (working with up to 25 dentists). Concerns were raised regarding inadequate disinfection protocols between dentists and dental laboratories and the poor quality of master impressions. Full arch plastic trays were the most popular impression tray used by dentists in the fabrication of crowns (61%) and bridgework (68%). The majority (89%) of jaw registration records were considered inaccurate. Forty-four percent of dental laboratories preferred using semi-adjustable articulators. Axial and occlusal under-preparation of abutment teeth was reported as an issue in about 25% of cases. Base metal alloy was the most (52%) commonly used alloy material. Metal-ceramic crowns were the most popular choice for anterior (69%) and posterior (70%) cases. The various factors considered did not have any statistically significant effect on the answers provided. The only notable exception was the fact that more methods of communicating the size and shape of crowns were utilised for

  4. DECONTAMINATION TECHNIQUES FOR MOBILE RESPONSE EQUIPMENT USED AT WASTE SITES (STATE-OF-THE-ART SURVEY)

    EPA Science Inventory

    A state-of-the-art review of facility and equipment decontamination, contamination assessment, and contamination avoidance has been conducted. The review, based on an intensive literature search and a survey of various equipment manufacturers, provides preliminary background mate...

  5. Land-based lidar mapping: a new surveying technique to shed light on rapid topographic change

    USGS Publications Warehouse

    Collins, Brian D.; Kayen, Robert

    2006-01-01

    The rate of natural change in such dynamic environments as rivers and coastlines can sometimes overwhelm the monitoring capacity of conventional surveying methods. In response to this limitation, U.S. Geological Survey (USGS) scientists are pioneering new applications of light detection and ranging (lidar), a laser-based scanning technology that promises to greatly increase our ability to track rapid topographic changes and manage their impact on affected communities.

  6. Indigo snake capture methods: effectiveness of two survey techniques for Drymarchon couperi in Georgia

    USGS Publications Warehouse

    Hyslop, N.L.; Meyers, J.M.; Cooper, R.J.; Stevenson, J.

    2009-01-01

    Drymarchon couperi (Eastern Indigo Snake), a federally threatened species of the southeastern Coastal Plain, has presented challenges for surveyors, with few reliable methods developed for its detection or monitoring. Surveys for D. couperi at potential underground shelters conducted in late fall through early spring have been relatively successful when conducted by experienced surveyors, especially in the northern portions of the range. However, trapping efforts for D. couperi conducted throughout the range have met with limited success. To further evaluate detection methods, we conducted trapping and surveying from December 2002 to April 2004 in areas known to support D. couperi in southeastern Georgia. We captured 18 D. couperi through surveys of potential underground shelters from December 2002 to March 2003 (14 person-hours per capture) and six individuals through trapping (141 trap days or 27 in-field person-hours per capture). Trapping was most successful during early fall, a period when surveys are often less effective compared to those conducted in late fall through early spring. We recommend a combination of surveys from mid-fall through March in conjunction with trapping, especially from late-summer through fall in the northern portions of the snake's range. We also recommend further experimentation with alternative trap designs and survey methods for D. couperi.

  7. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    PubMed Central

    Hernandez, Wilmar

    2007-01-01

    In this paper, a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is made. A comparison between classical filters and optimal filters for automotive sensors is presented, and the current state of the art of applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is illustrated through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be done overnight because there are some open research issues that have to be solved. This paper draws attention to one of the open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.

  8. A survey of nested grid techniques and their potential for use within the MASS weather prediction model

    NASA Technical Reports Server (NTRS)

    Koch, Steven E.; Mcqueen, Jeffery T.

    1987-01-01

    A survey of various one- and two-way interactive nested grid techniques used in hydrostatic numerical weather prediction models is presented and the advantages and disadvantages of each method are discussed. The techniques for specifying the lateral boundary conditions for each nested grid scheme are described in detail. Averaging and interpolation techniques used when applying the coarse mesh grid (CMG) and fine mesh grid (FMG) interface conditions during two-way nesting are discussed separately. The survey shows that errors are commonly generated at the boundary between the CMG and FMG due to boundary formulation or specification discrepancies. Methods used to control this noise include application of smoothers, enhanced diffusion, or damping-type time integration schemes to model variables. The results from this survey provide the information needed to decide which one-way and two-way nested grid schemes merit future testing with the Mesoscale Atmospheric Simulation System (MASS) model. An analytically specified baroclinic wave will be used to conduct systematic tests of the chosen schemes since this will allow for objective determination of the interfacial noise in the kind of meteorological setting for which MASS is designed. Sample diagnostic plots from initial tests using the analytic wave are presented to illustrate how the model-generated noise is ascertained. These plots will be used to compare the accuracy of the various nesting schemes when incorporated into the MASS model.
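    As a toy 1-D illustration of the CMG/FMG exchange described above (not the MASS model's actual scheme), the two directions of a two-way nest can be sketched as interpolation of the coarse field onto the fine grid, and averaging of fine-grid values back onto coincident coarse points. The refinement ratio of 3 and the linear test field are assumptions.

    ```python
    import numpy as np

    RATIO = 3  # hypothetical FMG points per CMG interval

    def cmg_to_fmg(cmg):
        """Linearly interpolate the CMG field onto the FMG (boundary/one-way step)."""
        x_c = np.arange(len(cmg))
        x_f = np.linspace(0, len(cmg) - 1, (len(cmg) - 1) * RATIO + 1)
        return np.interp(x_f, x_c, cmg)

    def fmg_to_cmg_feedback(fmg):
        """Average FMG values around each coincident CMG point (two-way feedback)."""
        out = []
        for i in range(0, len(fmg), RATIO):
            lo, hi = max(i - 1, 0), min(i + 2, len(fmg))
            out.append(fmg[lo:hi].mean())
        return np.array(out)

    cmg = np.array([0.0, 3.0, 6.0, 9.0])  # a linear coarse field
    fmg = cmg_to_fmg(cmg)                 # 10 fine points, still linear
    back = fmg_to_cmg_feedback(fmg)
    ```

    For a linear field, interior feedback values match the coarse field exactly; mismatches appear at the grid edges, a toy analogue of the interface noise the survey discusses.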

  9. [Surgical treatment of hemorrhoids using Milligan-Morgan technique. Survey of 366 cases].

    PubMed

    Latteri, M; Grassi, N; Salanitro, L; Pantuso, G; Bottino, A; Gitto, C; Farro, G

    1991-10-31

    After a careful review of the Authors' own case list and of the literature on this subject, the Milligan-Morgan technique is assessed and compared with different surgical techniques as far as early and late complications are concerned. The Authors conclude that the Milligan-Morgan technique is to be preferred because of its simplicity, safety and flexibility, particularly if associated with sphincterectomy, with or without rhagade, in order to prevent cicatricial scars. PMID:1766559

  10. Anaesthesia Techniques for Maxillary Molars – A Questionnaire-Based Retrospective Field Survey of Dentists in Western India

    PubMed Central

    Mittal, Priya

    2016-01-01

    Introduction Clinicians use various anaesthesia techniques like Posterior Superior Alveolar (PSA) nerve block and buccal infiltration, with or without supplemental anaesthesia such as palatal and intraligamentary infiltrations, for root canal treatment in maxillary molars. However, there is no general consensus regarding which technique is sufficient for performing endodontic treatment in maxillary molars. Aim The aim of this questionnaire-based survey is to compare and evaluate the various techniques used to anaesthetize the maxillary molars and their effect on postoperative pain. Materials and Methods The data were obtained from 290 dental practitioners using a specially prepared questionnaire survey conducted anonymously. The questionnaire contained questions covering data such as years in dentistry, acquired specialty, techniques used for anaesthetizing maxillary molars, success of anaesthesia, and postoperative pain. Results Buccal infiltration with supplemental anaesthesia in the form of palatal (82%) or intraligamentary (88%) infiltration showed higher success rates compared to buccal infiltration alone (69%). However, the intraligamentary infiltration group showed the highest rate (75%) of postoperative pain. General practitioners (62% of clinicians) prefer to give both buccal and palatal infiltrations, while specialists opt for buccal infiltration alone (66-74% of specialists). Conclusion Buccal infiltration alone is sufficient during root canal treatment of maxillary molars. Routine use of supplemental anaesthesia in the form of palatal and intraligamentary infiltration is not necessary unless the patient experiences discomfort during endodontic treatment. However, intraligamentary infiltration may lead to postoperative discomfort in the form of pain. PMID:27134993

  11. Raising Money Through Gift Clubs: A Survey of Techniques at 42 Institutions.

    ERIC Educational Resources Information Center

    Sweeney, Robert D., Comp.

    The way that 42 private schools, colleges, and universities use gift clubs to motivate donors is examined. Based on a nationwide survey, information is presented on the clubs' origins, requirements for membership, methods of enlisting new members, and ways of encouraging current members to increase gifts. Attention is also directed to the clubs'…

  12. Main principles and technique of electronystagmography (a brief survey of the literature)

    NASA Technical Reports Server (NTRS)

    Tanchev, K. S.

    1980-01-01

    Electronystagmography (ENG) is one of the modern methods for objective recording of nystagmus, its quantitative and qualitative assessment. It is used more and more often in clinical practice. A brief review of the history of recording of nystagmus and a survey of the relevant literature is presented.

  13. MALT-45: a 7 mm survey of the southern Galaxy - I. Techniques and spectral line data

    NASA Astrophysics Data System (ADS)

    Jordan, Christopher H.; Walsh, Andrew J.; Lowe, Vicki; Voronkov, Maxim A.; Ellingsen, Simon P.; Breen, Shari L.; Purcell, Cormac R.; Barnes, Peter J.; Burton, Michael G.; Cunningham, Maria R.; Hill, Tracey; Jackson, James M.; Longmore, Steven N.; Peretto, Nicolas; Urquhart, James S.

    2015-04-01

    We present the first results from the MALT-45 (Millimetre Astronomer's Legacy Team-45 GHz) Galactic Plane survey. We have observed 5 square degrees (l = 330°-335°, b = ±0.5°) for spectral lines in the 7 mm band (42-44 and 48-49 GHz), including CS (1-0), class I CH3OH masers in the 7(0,7)-6(1,6) A+ transition and SiO (1-0) v = 0, 1, 2, 3. MALT-45 is the first unbiased, large-scale, sensitive spectral line survey in this frequency range. In this paper, we present data from the survey as well as a few intriguing results; rigorous analyses of these science cases are reserved for future publications. Across the survey region, we detected 77 class I CH3OH masers, of which 58 are new detections, along with many sites of thermal and maser SiO emission and thermal CS. We found that 35 class I CH3OH masers were associated with the published locations of class II CH3OH, H2O and OH masers but 42 have no known masers within 60 arcsec. We compared the MALT-45 CS with NH3 (1,1) to reveal regions of CS depletion and high opacity, as well as evolved star-forming regions with a high ratio of CS to NH3. All SiO masers are new detections, and appear to be associated with evolved stars from the Spitzer Galactic Legacy Infrared Mid-Plane Survey Extraordinaire (GLIMPSE). Generally, within SiO regions of multiple vibrational modes, the intensity decreases as v = 1, 2, 3, but there are a few exceptions where v = 2 is stronger than v = 1.

  14. A survey of imagery techniques for semantic labeling of human-vehicle interactions in persistent surveillance systems

    NASA Astrophysics Data System (ADS)

    Elangovan, Vinayak; Shirkhodaie, Amir

    2011-06-01

    Understanding and semantic annotation of Human-Vehicle Interactions (HVI) facilitate fusion of Hard Sensor (HS) and Human Intelligence (HUMINT) data in a cohesive way. Through characterization, classification, and discrimination of HVI patterns, pertinent threats may be recognized. Various Persistent Surveillance System (PSS) imagery techniques have been proposed in the past decade for identifying human interactions with various objects in the environment. Understanding such interactions helps to discover human intentions and motives. However, without consideration of incidental context, reasoning about and analysis of such behavioral activities is a very challenging and difficult task. This paper presents a current survey of related publications in the area of context-based imagery techniques applied to HVI recognition. In particular, it discusses the taxonomy and ontology of HVI and presents a summary of reported robust image processing techniques for spatiotemporal characterization and tracking of human targets in urban environments. The discussed techniques include model-based, shape-based, and appearance-based techniques employed for identification and classification of objects. A detailed overview of major past research activities related to HVI in PSS, with exploitation of spatiotemporal reasoning techniques applied to semantic labeling of HVI, is also presented.

  15. The restoration of the Sangallo bastion in Fano: research, surveys, and conservation techniques.

    PubMed

    Galli, Claudio; Rosanò, Pietro

    2003-11-01

    The restoration plan, still in progress, of the Sangallo bastion in Fano, by Antonio da Sangallo the Younger, is the result of in-depth research to trace the chronological complexity of the monument. Through direct surveys--such as endoscopic tests, georadar profiles, and chemical analyses monitoring the nature and deterioration of the materials--it was possible to develop an "in itinere" conservation plan, interpreting the results obtained in light of the historical research. PMID:14703862

  16. Pigments with or without organic binder? A survey of wall painting techniques during Antiquity

    SciTech Connect

    Walter, P.

    1996-01-01

    The identification of ancient artistic techniques is based on laboratory studies and, for historical cases, also on literary sources. An analytical approach using the techniques of physical chemistry reveals the technical expertise of the artists, right at the dawn of art. In the case of prehistoric parietal art, we show that the artists prepared their pigments with different ground and mixed minerals. They applied their material onto the wall, and the particles remained embedded in the superficial calcite layer. Later, prehistoric people prepared a true paint with the proper pigment, an extender, and an organic binder to fix the paint on the wall. During Antiquity, new techniques appeared: the paint was applied to the natural or artificial wall, either directly or on a previously applied plaster. The aim of this paper is to describe the evolution of these techniques. The underlying chemistry provides some interesting clues about the technical choices. © 1996 American Institute of Physics.

  17. Monitoring Fine-Sediment Volume in the Colorado River Ecosystem, Arizona; Bathymetric Survey Techniques

    USGS Publications Warehouse

    Kaplinski, Matt; Hazel, Joseph E., Jr.; Parnell, Rod; Breedlove, Mike; Kohl, Keith; Gonzales, Mark

    2009-01-01

    In 2002, a fine-grained sediment (sand, silt, and clay) monitoring effort was initiated in the Colorado River ecosystem, the river corridor downstream from Glen Canyon Dam, to directly survey channel topography at scales previously unobtainable in this canyon setting. This report presents an overview of the equipment and the methods used to collect and process the high-resolution bathymetric data required for this monitoring effort. The survey methods were employed in up to 11 discrete reaches during various time intervals. The reaches varied in length from 1.3 to 6.4 km. An assessment of depth-measurement uncertainty is presented that shows the surveys meet or exceed the requirement needed to detect changes at the 0.25-m level with 95 percent confidence. These data, in the form of high-resolution digital elevation models, will be integrated in a geographic information system and used to compare maps of topography, grain size, and other information to study the spatial distribution of fine sediment in this system.
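    The 0.25-m/95-percent detection requirement quoted above implies a standard change-detection criterion: a vertical change between two surveys is significant when it exceeds about 1.96 times the root-sum-square of the two surveys' depth uncertainties. The 9 cm per-survey sigma below is an assumed value for illustration, not the report's measured uncertainty.

    ```python
    import math

    def min_detectable_change(sigma1_m, sigma2_m, z=1.96):
        """Smallest elevation change detectable at the given confidence level
        (z = 1.96 for 95%) given the two surveys' depth uncertainties."""
        return z * math.sqrt(sigma1_m**2 + sigma2_m**2)

    # Two surveys, each with an assumed 9 cm depth uncertainty, would just
    # meet a 0.25 m change-detection threshold at 95% confidence.
    mdc = min_detectable_change(0.09, 0.09)
    ```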

  18. Terrestrial Laser Scanning for Quantifying Habitat and Hydraulic Complexity Measures: A Comparison with Traditional Surveying Techniques

    NASA Astrophysics Data System (ADS)

    Resop, J. P.; Kozarek, J. L.; Hession, W. C.

    2010-12-01

    Accurate stream topography measurement is important for many ecological applications such as hydraulic modeling and habitat characterization. Measures of habitat complexity are often difficult to quantify or are performed qualitatively. Traditional surveying with a total station can be time intensive and limited by poor spatial resolution. These problems lead to measurement and interpolation errors, which propagate to model uncertainty. Terrestrial laser scanning (TLS) has the potential to measure topography at a high resolution and accuracy. Two methods, total station surveying and TLS, were used to measure a 100-m forested reach on the Staunton River in Shenandoah National Park, VA, USA. The TLS dataset was post-processed to remove vegetation and create a 2-cm digital elevation model (DEM). The position and size of ten rocks were compared for each method. An algorithm was developed for delineating rocks within the stream channel from the TLS DEM. Ecological metrics based on the structural complexity of the stream, such as percent in-stream rock cover and cross-sectional heterogeneity, were derived from the TLS dataset for six habitat areas and compared with the estimates from traditional methods. Compared to TLS, total station surveying underestimated rock volume and cross-sectional heterogeneity by 55% and 41%, respectively. TLS has the potential to quantify habitat complexity measures in an automated, unbiased manner.
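    Two of the complexity metrics named above can be sketched directly from a gridded DEM: percent in-stream rock cover (cells rising more than a threshold above the bed level) and cross-sectional heterogeneity (elevation spread per cross-section). The threshold, the use of the median as bed level, and the toy DEM are assumptions for illustration, not the study's delineation algorithm.

    ```python
    import numpy as np

    def percent_rock_cover(dem, threshold=0.10):
        """Percent of DEM cells protruding more than `threshold` m above the bed,
        with the bed level taken here as the median elevation."""
        bed = np.median(dem)
        return 100.0 * np.mean(dem > bed + threshold)

    def cross_section_heterogeneity(dem):
        """Elevation standard deviation per row (one cross-section per row)."""
        return dem.std(axis=1)

    dem = np.full((4, 6), 1.0)       # flat bed at 1.0 m, 2 cm-style grid in spirit
    dem[1, 2] = dem[2, 4] = 1.5      # two protruding "rocks"

    cover = percent_rock_cover(dem)              # 2 of 24 cells
    hetero = cross_section_heterogeneity(dem)
    ```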

  19. Comparison of microbial and sorbed soil gas surface geochemical techniques with seismic surveys from the Southern Altiplano, Bolivia

    SciTech Connect

    Aranibar, O.R.; Tucker, J.D.; Hiltzman, D.C.

    1995-12-31

    Yacimientos Petroliferos Fiscales Bolivianos (YPFB) undertook a large seismic evaluation in the southern Altiplano, Bolivia in 1994. As an additional layer of information, sorbed soil gas and Microbial Oil Survey Technique (MOST) geochemical surveys were conducted to evaluate the hydrocarbon microseepage potential. The Wara Sara Prospect had 387 sorbed soil gas samples, collected from one meter depth, and 539 shallow soil microbial samples, collected from 15 to 20 centimeter depth. The sorbed soil gas samples were collected every 500 meters and microbial samples every 250 meters along geochemical traverses spaced 1 km apart. The presence of anomalous hydrocarbon microseepage is indicated by (1) a single hydrocarbon source identified by gas crossplots, (2) the high gas values with a broad range, (3) the high overall gas average, (4) the clusters of elevated samples, and (5) the right hand skewed data distributions.

  20. Precise near-earth navigation with GPS: A survey of techniques

    NASA Technical Reports Server (NTRS)

    Yunck, T. P.; Wu, S. C.; Wu, J.

    1987-01-01

    The tracking accuracy of low earth orbiters (below about 3000 km altitude) can be brought below 10 cm with a variety of differential techniques that exploit the Global Positioning System (GPS). All of these techniques require a precisely known global network of GPS ground receivers and a receiver aboard the user satellite, and all simultaneously estimate the user and GPS satellite orbits. The three basic approaches are the geometric, dynamic, and nondynamic strategies. The last combines dynamic GPS solutions with a geometric user solution. Two powerful extensions of the nondynamic strategy show considerable promise. The first uses an optimized synthesis of dynamics and geometry in the user solution, while the second uses a novel gravity-adjustment method to exploit data from repeat ground tracks. These techniques will offer sub-decimeter accuracy for dynamically unpredictable satellites down to the lowest possible altitudes.

  1. Test techniques: A survey paper on cryogenic tunnels, adaptive wall test sections, and magnetic suspension and balance systems

    NASA Technical Reports Server (NTRS)

    Kilgore, Robert A.; Dress, David A.; Wolf, Stephen W. D.; Britcher, Colin P.

    1989-01-01

    The ability to get good experimental data in wind tunnels is often compromised by things seemingly beyond our control. Inadequate Reynolds number, wall interference, and support interference are three of the major problems in wind tunnel testing. Techniques for solving these problems are available. Cryogenic wind tunnels solve the problem of low Reynolds number. Adaptive wall test sections can go a long way toward eliminating wall interference. A magnetic suspension and balance system (MSBS) completely eliminates support interference. Cryogenic tunnels, adaptive wall test sections, and MSBS are surveyed. A brief historical overview is given and the present state of development and application in each area is described.

  2. Multispectral techniques for general geological surveys: evaluation of a four-band photographic system

    NASA Technical Reports Server (NTRS)

    Crowder, D. F.

    1969-01-01

    A general geological survey at 1:62,500 scale of the well exposed rocks of the White Mountains and the adjacent volcanic desert plateau is reported. The tuffs, granites, sedimentary rocks and metavolcanic rocks in this arid region are varicolored and conventional black and white aerial photographs have been a useful mapping aid. A large number of true color and false color aerial photographs and multispectral viewer screen images of the study area are evaluated in order to consider what imagery is the most useful for distinguishing rock types. Photographs of true color film are judged the most useful for recognizing geographic locations.

  3. New Data Reduction Techniques for Circumstellar Disk Imaging with the Hubble DICE Survey

    NASA Astrophysics Data System (ADS)

    Wilson, Benjamin; Griggs, Zachary; Gardner, Clay; Carson, Joseph; Schneider, Glenn; Stark, Christopher C.; HST/GO 12228 Team

    2015-01-01

    We present a status report on our efforts to develop an image processing pipeline that combines multiple tools in order to improve the effective sensitivity of Hubble Space Telescope (HST) STIS imaging observations of circumstellar disks around young, nearby stars. The pipeline incorporates a combination of MRRR, LOCI, RSS, RAM, Shizzle, and smoothing algorithms to strip away the overwhelming light from the parent star, remove outlying pixel values, and output high-resolution, sub-pixelated, final images. The developed pipeline has been applied to data collected as part of the Hubble DICE Survey (GO 12228) in an effort to reveal disk substructures which may be signposts of planet formation.

  4. A deep survey for Galactic Wolf-Rayet stars. I - Motivation, search technique, and first results

    NASA Technical Reports Server (NTRS)

    Shara, Michael M.; Smith, Lindsey F.; Potter, Michael; Moffat, Anthony F. J.

    1991-01-01

    Results are presented from a survey of large areas of the southern Milky Way for Wolf-Rayet (WR) stars to 17-18th magnitude, carried out using direct narrowband and broadband Schmidt plates. Thirteen new WR stars were detected in an approximately 40-square-degree region in Carina, where 24 WR stars were already known; the new stars were found to be significantly redder, fainter, and farther away than the known stars. Of the new WR stars, 11 are of subtype WN and two are WC, compared to the 17 WN and seven WC stars among the previously known WR stars in the same area.

  5. Minimum detectable concentration as a function of gamma walkover survey technique.

    PubMed

    King, David A; Altic, Nickolas; Greer, Colt

    2012-02-01

    Gamma walkover surveys are often performed by swinging the radiation detector (e.g., a 2-inch by 2-inch sodium iodide) in a serpentine pattern at a near constant height above the ground surface. The objective is to survey an approximate 1-m swath with 100% coverage producing an equal probability of detecting contamination at any point along the swing. In reality, however, the detector height will vary slightly along the swing path, and in some cases the detector may follow a pendulum-like motion significantly reducing the detector response and increasing the minimum detectable concentration. This paper quantifies relative detector responses for fixed and variable height swing patterns and demonstrates negative impacts on the minimum detectable concentration. Minimum detectable concentrations are calculated for multiple contaminated surface areas (0.1, 1.0, 3, 10, and 30 m2), multiple contaminants (60Co, 137Cs, 241Am, and 226Ra), and two minimum heights (5 and 10 cm). Exposure rate estimates used in minimum detectable concentration calculations are produced using MicroShield™ v.7.02 (Grove Software, Inc., 4925 Boonsboro Road #257, Lynchburg, VA 24503) and MDCs are calculated as outlined in NUREG-1575. Results confirm a pendulum-like detector motion can significantly increase MDCs relative to a low flat trajectory, especially for small areas of elevated activity--up to a 47% difference is observed under worst-modeled conditions. PMID:22249469
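    The height effect described above can be sketched with a back-of-the-envelope model: for a small area of elevated activity approximated as a point source, detector response falls roughly as 1/h², so a pendulum-like arc that lifts the detector at the ends of each swing lowers the path-averaged response relative to a flat pass at the minimum height. The arc geometry below is hypothetical, not the paper's MicroShield modeling.

    ```python
    import math

    def relative_response(heights_cm, ref_cm=5.0):
        """Path-averaged 1/h^2 response along a swing, relative to a flat
        pass held at ref_cm above a small (point-like) source."""
        avg = sum(1.0 / h**2 for h in heights_cm) / len(heights_cm)
        return avg / (1.0 / ref_cm**2)

    # Pendulum-like arc: 5 cm at the center of the swing, 20 cm at the ends.
    arc = [5 + 15 * math.cos(math.pi * i / 20)**2 for i in range(21)]
    loss = 1.0 - relative_response(arc)  # fraction of response lost vs flat pass
    ```

    A lower average response translates directly into a higher minimum detectable concentration, which is the paper's central point.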

  6. A survey of provably correct fault-tolerant clock synchronization techniques

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1988-01-01

    Six provably correct fault-tolerant clock synchronization algorithms are examined. These algorithms are all presented in the same notation to permit easier comprehension and comparison. The advantages and disadvantages of the different techniques are examined and issues related to the implementation of these algorithms are discussed. The paper argues for the use of such algorithms in life-critical applications.
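    To give a flavor of the class of algorithms such a survey covers, here is a minimal sketch in the style of the Welch-Lynch fault-tolerant midpoint: discard the f smallest and f largest clock readings, then adjust to the midpoint of the survivors, tolerating up to f faulty clocks. This is a generic illustration, not one of the six surveyed algorithms verbatim.

    ```python
    def fault_tolerant_midpoint(readings, f):
        """Midpoint of the readings that remain after discarding the f lowest
        and f highest values; requires more than 2f readings."""
        if len(readings) <= 2 * f:
            raise ValueError("need more than 2f readings to tolerate f faults")
        kept = sorted(readings)[f:len(readings) - f]
        return (kept[0] + kept[-1]) / 2.0

    # Four nodes, one faulty clock (f = 1) reporting a wild value: the fault
    # is discarded and the correction stays near the good clocks.
    readings = [100.2, 99.9, 100.1, 250.0]
    corrected = fault_tolerant_midpoint(readings, f=1)
    ```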

  7. Water vapor as an error source in microwave geodetic systems: Background and survey of calibration techniques. [very long baseline interferometry]

    NASA Technical Reports Server (NTRS)

    Claflin, E. S.; Resch, G. M.

    1980-01-01

    Water vapor as an error source in radio interferometry systems is briefly examined. At microwave frequencies, the delay imposed by tropospheric water vapor becomes a limiting error source for high accuracy geodetic systems. The mapping of tropospheric induced errors into 'solved-for' parameters depends upon baseline length and observing strategy. Simulation analysis (and experience) indicates that in some cases, errors in estimating tropospheric delay can be magnified in their effect on baseline components. The various techniques by which tropospheric water can be estimated or measured are surveyed, with particular consideration of their possible use as calibration techniques in support of very long baseline interferometry experiments. The method of remote sensing using a microwave radiometer seems to be the most effective way to provide an accurate estimate of water vapor delay.

  8. A Survey on Large High-Resolution Display Technologies, Techniques, and Applications

    SciTech Connect

    Ni, Tao; Schmidt, Greg S.; Staadt, Oliver G.; Livingston, Mark A.; Ball, Robert; May, Richard A.

    2006-03-27

    Continued advances in display hardware, computing power, networking, and rendering algorithms have all converged to dramatically improve large high-resolution display capabilities. We present a survey on prior research with large high-resolution displays. In the hardware configurations section we examine systems including multi-monitor workstations, reconfigurable projector arrays, and others. Rendering and the data pipeline are addressed with an overview of current technologies. We discuss many applications for large high-resolution displays such as automotive design, scientific visualization, control centers, and others. Quantifying the effect of large high-resolution displays on human performance and other aspects is important as we look toward future advances in display technology and how it is applied in different situations. Interacting with these displays brings a different set of challenges for HCI professionals, so an overview of some of this work is provided. Finally, we present our view of the top ten greatest challenges in large high-resolution displays.

  9. Survey of techniques for reduction of wind turbine blade trailing edge noise.

    SciTech Connect

    Barone, Matthew Franklin

    2011-08-01

    Aerodynamic noise from wind turbine rotors leads to constraints in both rotor design and turbine siting. The primary source of aerodynamic noise on wind turbine rotors is the interaction of turbulent boundary layers on the blades with the blade trailing edges. This report surveys concepts that have been proposed for trailing edge noise reduction, with emphasis on concepts that have been tested at either sub-scale or full-scale. These concepts include trailing edge serrations, low-noise airfoil designs, trailing edge brushes, and porous trailing edges. The demonstrated noise reductions of these concepts are cited, along with their impacts on aerodynamic performance. An assessment is made of future research opportunities in trailing edge noise reduction for wind turbine rotors.

  10. Dietary survey methods. 1. A semi-weighted technique for measuring dietary intake within families.

    PubMed

    Nelson, M; Nettleton, P A

    1980-10-01

    Family diet studies which measure total family consumption can determine only the average nutrient intake. A method has been devised to measure all family members' individual diets concurrently in order to learn how food and nutrient intake is distributed within the family. In this semi-weighed method, the total quantity of food available for consumption by the family is weighed at the time of preparation or serving, and the distribution between family members is recorded in household measures. The method is described in detail. It provides data on individual consumption with an accuracy approaching that of a weighed survey. A co-operation rate of 73 per cent in a random sample of 74 households with two adults and two or three children indicates that this semi-weighed method can be used to assess family diets in a broad cross-section of socio-economic backgrounds. PMID:7419908

  11. Development of predictive mapping techniques for soil survey and salinity mapping

    NASA Astrophysics Data System (ADS)

    Elnaggar, Abdelhamid A.

    Conventional soil maps are a valuable source of information about soil characteristics; however, they are subjective, expensive, and time-consuming to prepare, and they include neither explicit information about the conceptual mental model used in developing them nor estimates of their accuracy and associated error. Decision tree analysis (DTA) was successfully used in retrieving the expert knowledge embedded in old soil survey data. This knowledge was efficiently used in developing predictive soil maps for the study areas in Benton and Malheur Counties, Oregon, and in assessing their consistency. A soil-landscape model retrieved from a reference area in Harney County was extrapolated to develop a preliminary soil map for the neighboring unmapped part of Malheur County. The resulting map had low prediction accuracy, and only a few soil map units (SMUs) were predicted with significant accuracy, mostly shallow SMUs that either have a lithic contact with the bedrock or developed on a duripan. On the other hand, the soil map developed from field data was predicted with very high accuracy (overall about 97%). Salt-affected areas of the Malheur County study area are indicated by their high spectral reflectance and are easily discriminated in remote sensing data. However, remote sensing data fail to distinguish between the different classes of soil salinity. Using the DTA method, five classes of soil salinity were successfully predicted with an overall accuracy of about 99%. Moreover, the area of salt-affected soil was overestimated when mapped using remote sensing data compared with that predicted by DTA. Hence, DTA can be a very helpful approach for developing soil survey and soil salinity maps in a more objective, effective, less expensive, and quicker way based on field data.
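
    The core step of the decision tree analysis described above can be sketched as a single Gini-impurity split on one soil attribute. The attribute (an electrical-conductivity reading), the values, and the two salinity class labels below are illustrative assumptions, not data from the study.

```python
# Minimal sketch of one step of decision tree analysis (DTA):
# choose the threshold on a single attribute that minimizes Gini impurity.
# The "ec" (electrical conductivity, dS/m) values and the class labels are
# hypothetical illustration data, not values from the study.

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(values, labels):
    """Return (threshold, weighted_gini) of the best binary split v <= t."""
    best = (None, float("inf"))
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best[1]:
            best = (t, score)
    return best

# Hypothetical samples: EC readings with "non-saline" vs "saline" labels.
ec = [1.0, 1.5, 2.0, 5.0, 6.0, 8.0]
cls = ["non-saline"] * 3 + ["saline"] * 3
threshold, score = best_split(ec, cls)
print(threshold, score)  # a perfect split at EC <= 2.0 gives impurity 0.0
```

    A full DTA builds such splits recursively; libraries like scikit-learn automate this, but the splitting criterion is the same.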

  12. Mass spectrometry techniques in the survey of steroid metabolites as potential disease biomarkers: a review

    PubMed Central

    Gouveia, Maria João; Brindley, Paul J.; Santos, Lúcio Lara; da Costa, José Manuel Correia; Gomes, Paula; Vale, Nuno

    2013-01-01

    Mass spectrometric approaches have been fundamental to the identification of metabolites associated with steroid hormones, yet this topic has not been reviewed in depth in recent years. To this end, and given the increasing relevance of liquid chromatography-mass spectrometry (LC-MS) studies on steroid hormones and their metabolites, the present review addresses this subject. This review provides a timely summary of the use of various mass spectrometry-based analytical techniques during the evaluation of steroidal biomarkers in a range of human disease settings. The sensitivity and specificity of these technologies are clearly providing valuable new insights into breast cancer and cardiovascular disease. We aim to contribute to an enhanced understanding of steroid metabolism and how it can be profiled by LC-MS techniques. PMID:23664145

  13. Different types of maximum power point tracking techniques for renewable energy systems: A survey

    NASA Astrophysics Data System (ADS)

    Khan, Mohammad Junaid; Shukla, Praveen; Mustafa, Rashid; Chatterji, S.; Mathew, Lini

    2016-03-01

    Global demand for electricity is increasing while production of energy from fossil fuels is declining, so the obvious choice of a clean energy source that is abundant and could provide security for future development is energy from the sun. The voltage-power characteristic of a photovoltaic generator is nonlinear and exhibits multiple peaks, including many local peaks and one global peak under non-uniform irradiance. To track the global peak, maximum power point tracking (MPPT) is an essential component of photovoltaic systems. Many review articles discuss conventional techniques such as perturb and observe (P&O), incremental conductance, and ripple correlation control, but very few attempts have been made to survey intelligent MPPT techniques. This paper also discusses algorithms based on fuzzy logic, Ant Colony Optimization, Genetic Algorithms, artificial neural networks, Particle Swarm Optimization, the Firefly Algorithm, extremum seeking control, and hybrid methods applied to maximum power point tracking in photovoltaic systems under changing irradiance conditions.
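
    The conventional perturb-and-observe (P&O) technique named above can be sketched in a few lines: perturb the operating voltage, observe the change in power, and keep perturbing in the same direction while power rises. The PV curve below is a simplified single-peak stand-in, not a real module model, and all numbers are assumed for illustration.

```python
# Minimal perturb-and-observe (P&O) MPPT sketch.
# pv_power() is a simplified single-peak stand-in for a real PV curve;
# real modules are nonlinear and, under partial shading, multi-peaked.

def pv_power(v):
    """Hypothetical PV power curve with its maximum (200 W) at v = 30 V."""
    return max(0.0, 200.0 - 0.5 * (v - 30.0) ** 2)

def perturb_and_observe(v0, step=0.5, iterations=200):
    """Track the maximum power point by hill climbing on measured power."""
    v, p = v0, pv_power(v0)
    direction = 1.0
    for _ in range(iterations):
        v_new = v + direction * step
        p_new = pv_power(v_new)
        if p_new < p:          # power dropped: reverse the perturbation
            direction = -direction
        v, p = v_new, p_new
    return v, p

v_mpp, p_mpp = perturb_and_observe(v0=20.0)
print(v_mpp, p_mpp)  # settles into a limit cycle within one step of the 30 V peak
```

    The steady-state oscillation around the peak is the classic drawback of P&O that the intelligent techniques surveyed in the paper aim to reduce; note also that plain hill climbing can lock onto a local peak under partial shading.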

  14. Environmental DNA sampling is more sensitive than a traditional survey technique for detecting an aquatic invader.

    PubMed

    Smart, Adam S; Tingley, Reid; Weeks, Andrew R; van Rooyen, Anthony R; McCarthy, Michael A

    2015-10-01

    Effective management of alien species requires detecting populations in the early stages of invasion. Environmental DNA (eDNA) sampling can detect aquatic species at relatively low densities, but few studies have directly compared detection probabilities of eDNA sampling with those of traditional sampling methods. We compare the ability of a traditional sampling technique (bottle trapping) and eDNA to detect a recently established invader, the smooth newt Lissotriton vulgaris vulgaris, at seven field sites in Melbourne, Australia. Over a four-month period, per-trap detection probabilities ranged from 0.01 to 0.26 among sites where L. v. vulgaris was detected, whereas per-sample eDNA estimates were much higher (0.29-1.0). Detection probabilities of both methods varied temporally (across days and months), but temporal variation appeared to be uncorrelated between methods. Only estimates of spatial variation were strongly correlated across the two sampling techniques. Environmental variables (water depth, rainfall, ambient temperature) were not clearly correlated with detection probabilities estimated via trapping, whereas eDNA detection probabilities were negatively correlated with water depth, possibly reflecting higher eDNA concentrations at lower water levels. Our findings demonstrate that eDNA sampling can be an order of magnitude more sensitive than traditional methods, and illustrate that traditional- and eDNA-based surveys can provide independent information on species distributions when occupancy surveys are conducted over short timescales. PMID:26591459
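
    The gap between per-sample detection probabilities compounds over repeat visits: with per-sample probability p, the chance of at least one detection in n independent samples is 1 - (1 - p)^n. The p values below come from the ranges quoted in the abstract and are used purely for illustration, not as a reanalysis of the study data.

```python
# Probability of at least one detection in n independent samples,
# given a per-sample detection probability p: 1 - (1 - p)**n.
# The p values are taken from the ranges quoted in the abstract
# purely for illustration; n = 10 is an assumed survey effort.

def cumulative_detection(p, n):
    """Chance of detecting the species at least once in n samples."""
    return 1.0 - (1.0 - p) ** n

trap_low = cumulative_detection(0.01, n=10)   # worst per-trap estimate
trap_high = cumulative_detection(0.26, n=10)  # best per-trap estimate
edna_low = cumulative_detection(0.29, n=10)   # lowest per-sample eDNA estimate

print(round(trap_low, 3), round(trap_high, 3), round(edna_low, 3))
```

    Even the lowest quoted eDNA probability beats the best trapping estimate once several samples are taken, which is the practical sense in which eDNA sampling is "an order of magnitude more sensitive" at the low end.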

  15. Nonmedical influences on medical decision making: an experimental technique using videotapes, factorial design, and survey sampling.

    PubMed Central

    Feldman, H A; McKinlay, J B; Potter, D A; Freund, K M; Burns, R B; Moskowitz, M A; Kasten, L E

    1997-01-01

    OBJECTIVE: To study nonmedical influences on the doctor-patient interaction. A technique using simulated patients and "real" doctors is described. DATA SOURCES: A random sample of physicians, stratified on such characteristics as demographics, specialty, or experience, and selected from commercial and professional listings. STUDY DESIGN: A medical appointment is depicted on videotape by professional actors. The patient's presenting complaint (e.g., chest pain) allows a range of valid interpretation. Several alternative versions are taped, featuring the same script with patient-actors of different age, sex, race, or other characteristics. Fractional factorial design is used to select a balanced subset of patient characteristics, reducing costs without biasing the outcome. DATA COLLECTION: Each physician is shown one version of the videotape appointment and is asked to describe how he or she would diagnose or treat such a patient. PRINCIPAL FINDINGS: Two studies using this technique have been completed to date, one involving chest pain and dyspnea and the other involving breast cancer. The factorial design provided sufficient power, despite limited sample size, to demonstrate with statistical significance various influences of the experimental and stratification variables, including the patient's gender and age and the physician's experience. Persistent recruitment produced a high response rate, minimizing selection bias and enhancing validity. CONCLUSION: These techniques permit us to determine, with a degree of control unattainable in observational studies, whether medical decisions as described by actual physicians and drawn from a demographic or professional group of interest, are influenced by a prescribed set of nonmedical factors. PMID:9240285
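
    The balanced-subset idea behind the fractional factorial design can be sketched for three two-level patient factors coded -1/+1: a half-fraction 2^(3-1) design keeps only runs satisfying a defining relation (here I = ABC), halving the number of videotapes to produce while keeping every main effect balanced. The factor names are illustrative, not the study's actual factors.

```python
# Half-fraction 2^(3-1) factorial design sketch.
# Three two-level factors coded -1/+1; the defining relation I = ABC
# keeps only runs whose levels multiply to +1, halving the design
# while keeping each main effect balanced. Factor names are illustrative.
from itertools import product

factors = ["sex", "age_group", "race"]
full_design = list(product([-1, 1], repeat=len(factors)))   # 8 runs
half_fraction = [run for run in full_design
                 if run[0] * run[1] * run[2] == 1]          # 4 runs kept

for run in half_fraction:
    print(dict(zip(factors, run)))

# Balance check: each factor takes -1 and +1 equally often in the fraction.
for i in range(len(factors)):
    assert sum(run[i] for run in half_fraction) == 0
```

    The cost of the fraction is aliasing: with I = ABC, each main effect is confounded with a two-factor interaction, a trade-off accepted when interactions are assumed small.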

  16. Contextualising Water Use in Residential Settings: A Survey of Non-Intrusive Techniques and Approaches.

    PubMed

    Carboni, Davide; Gluhak, Alex; McCann, Julie A; Beach, Thomas H

    2016-01-01

    Water monitoring in households is important to ensure the sustainability of fresh water reserves on our planet. It provides stakeholders with the statistics required to formulate optimal strategies in residential water management. However, monitoring should not be prohibitive: appliance-level water monitoring cannot practically be achieved by deploying sensors on every faucet or water-consuming device of interest, due to the high hardware costs and complexity, not to mention the risk of accidental leakage from the extra plumbing needed. Machine learning and data mining are promising techniques for analysing monitored data to obtain non-intrusive water usage disaggregation, because they can discern water usage from aggregated data acquired at a single point of observation. This paper provides an overview of water usage disaggregation systems and related techniques adopted for water event classification. The state of the art in algorithms and testbeds used for fixture recognition is reviewed, and a discussion of the prominent challenges and future research is also included. PMID:27213397

  18. Survey of adult liver transplantation techniques (SALT): an international study of current practices in deceased donor liver transplantation

    PubMed Central

    Kluger, Michael D; Memeo, Riccardo; Laurent, Alexis; Tayar, Claude; Cherqui, Daniel

    2011-01-01

    Background There has been little recent focus on operative techniques for full-graft liver transplantation, and the standard technique is unclear. Methods An internet survey addressing the key technical issues was e-mailed to programme directors. Results Responses were obtained from 93 out of 128 (73%) directors contacted. Programmes performed a median of 60 (8–240) transplants per year. A maximum mean cold time of 13 ± 3 h and a maximum median steatosis of 40% (15–90%) were tolerated. The inferior vena cava was preserved by 48% of centres all the time and by 43% selectively. European centres used temporary portacaval shunting (42%) four times more often than USA programmes. Fewer than 25% of centres always used venous bypass when the inferior vena cava was not preserved, and approximately 40% used it selectively. Portal vein anastomosis with room for expansion (88%), graft hepatic artery to native gastroduodenal/common hepatic artery bifurcation (57%), and bile duct-to-duct anastomosis (47%) were the favoured techniques. Discussion A standard international operative technique for deceased donor liver transplantation does not exist, although there is a trend towards inferior vena cava preservation. Donor selection criteria were more homogeneous across programmes. As suggested by the high response rate, there is likely interest in investigating technical variations on an international scale. PMID:21929669

  19. A survey of probabilistic methods used in reliability, risk and uncertainty analysis: Analytical techniques 1

    SciTech Connect

    Robinson, D.G.

    1998-06-01

    This report provides an introduction to the various probabilistic methods developed roughly between 1956 and 1985 for performing reliability or probabilistic uncertainty analysis on complex systems. This exposition does not include the traditional reliability methods (e.g. parallel-series systems, etc.) that might be found in the many reliability texts and reference materials. Rather, the report centers on the relatively new, and certainly less well known across the engineering community, analytical techniques. Discussion of the analytical methods has been broken into two reports. This particular report is limited to those methods developed between 1956 and 1985. While a bit dated, methods described in the later portions of this report still dominate the literature and provide a necessary technical foundation for more current research. A second report (Analytical Techniques 2) addresses methods developed since 1985. The flow of this report roughly follows the historical development of the various methods, so each new technique builds on the discussion of the strengths and weaknesses of previous techniques. To facilitate understanding of the various methods discussed, a simple 2-dimensional problem is used throughout the report. The problem is used for discussion purposes only; conclusions regarding the applicability and efficiency of particular methods are based on secondary analyses and a number of years of experience by the author. This document should be considered a living document in the sense that as new methods or variations of existing methods are developed, the document and references will be updated to reflect the current state of the literature as much as possible. For those scientists and engineers already familiar with these methods, the discussion will at times become rather obvious. However, the goal of this effort is to provide a common basis for future discussions and, as such, will hopefully be useful to those more intimate with

  20. Monitoring Fine-Grained Sediment in the Colorado River Ecosystem, Arizona - Control Network and Conventional Survey Techniques

    USGS Publications Warehouse

    Hazel, Joseph E., Jr.; Kaplinski, Matt; Parnell, Roderic A.; Kohl, Keith; Schmidt, John C.

    2008-01-01

    In 2002, fine-grained sediment (sand, silt, and clay) monitoring in the Colorado River downstream from Glen Canyon Dam was initiated to survey channel topography at scales previously unobtainable in this canyon setting. This report presents the methods used to establish the high-resolution global positioning system (GPS) control network required for this effort, as well as the conventional surveying techniques used in the study. Using simultaneous, dual-frequency GPS vector-based methods, the network points were determined to have positioning accuracies of less than 0.03 meters (m) and ellipsoidal height accuracies of between 0.01 and 0.10 m at a 95-percent degree of confidence. We also assessed network point quality with repeated, electronic (optical) total-station observations at 39 points, for a total of 362 measurements; the mean range was 0.022 m in horizontal and 0.13 m in vertical at a 95-percent confidence interval. These results indicate that the control network is of sufficient spatial and vertical accuracy for collection of airborne and subaerial remote-sensing data and integration of these data in a geographic information system on a repeatable basis without anomalies. The monitoring methods were employed in up to 11 discrete reaches over various time intervals. The reaches varied from 1.3 to 6.4 kilometers in length. Field results from surveys in 2000, 2002, and 2004 are described, during which conventional surveying was used to collect more than 3000 points per day. Ground points were used as checkpoints and to supplement areas just below or above the water surface, where remote-sensing data are not collected or are subject to greater error. An accuracy of ±0.05 m was identified as the minimum precision of individual ground points. These results are important for assessing digital elevation model (DEM) quality and identifying detection limits of significant change among surfaces generated from remote-sensing technologies.

  1. Geo-Acoustic Doppler Spectroscopy: A Novel Acoustic Technique For Surveying The Seabed

    NASA Astrophysics Data System (ADS)

    Buckingham, Michael J.

    2010-09-01

    An acoustic inversion technique, known as Geo-Acoustic Doppler Spectroscopy, has recently been developed for estimating the geo-acoustic parameters of the seabed in shallow water. The technique is unusual in that it utilizes a low-flying, propeller-driven light aircraft as an acoustic source. Both the engine and propeller produce sound and, since they are rotating sources, the acoustic signature of each takes the form of a sequence of narrow-band harmonics. Although the coupling of the harmonics across the air-sea interface is inefficient, due to the large impedance mismatch between air and water, sufficient energy penetrates the sea surface to provide a useable underwater signal at sensors either in the water column or buried in the sediment. The received signals, which are significantly Doppler shifted due to the motion of the aircraft, will have experienced a number of reflections from the seabed and thus they contain information about the sediment. A geo-acoustic inversion of the Doppler-shifted modes associated with each harmonic yields an estimate of the sound speed in the sediment; and, once the sound speed has been determined, the known correlations between it and the remaining geo-acoustic parameters allow all of the latter to be computed. This inversion technique has been applied to aircraft data collected in the shallow water north of Scripps pier, returning values of the sound speed, shear speed, porosity, density and grain size that are consistent with the known properties of the sandy sediment in the channel.
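
    The Doppler shift exploited by this technique follows the standard moving-source relation f_obs = f0 · c / (c - v·cos θ). The harmonic frequency, aircraft speed, and sound speed in air below are assumed round numbers for illustration, not values from the experiment.

```python
# Doppler shift of a propeller harmonic from a moving airborne source,
# using the standard moving-source relation:
#   f_obs = f0 * c / (c - v * cos(theta)).
# All numbers are assumed for illustration, not taken from the study.
import math

def doppler_shifted(f0_hz, speed_ms, theta_deg, c_air_ms=343.0):
    """Observed frequency for a source moving at angle theta to the line of sight."""
    return f0_hz * c_air_ms / (c_air_ms - speed_ms * math.cos(math.radians(theta_deg)))

f0 = 80.0   # assumed propeller blade-rate harmonic, Hz
v = 50.0    # assumed aircraft speed, m/s (roughly 100 kt)

approaching = doppler_shifted(f0, v, theta_deg=0.0)    # aircraft flying toward receiver
receding = doppler_shifted(f0, v, theta_deg=180.0)     # aircraft flying away

print(round(approaching, 1), round(receding, 1))
```

    The sweep of each narrow-band harmonic from the up-shifted to the down-shifted frequency as the aircraft passes overhead is what carries the information the inversion uses.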

  2. Novel Techniques for Survey and Classification Studies to Improve Patient Centered Websites

    PubMed Central

    Chused, Amy; Payne, Philip R.O.; Starren, Justin B.

    2006-01-01

    There is great interest in ascertaining patient perceptions in order to create more patient-friendly web resources. The recent proliferation of inexpensive web based data collection systems can facilitate this. Many quite sophisticated tools are commercially available. Unfortunately, researchers often recreate these capabilities in order to avoid privacy issues. This poster describes a simple architecture that allows use of a commercial system while maintaining privacy. In this example, the commercial tool supports the collection of complex categorical sorting data relating to chemotherapy systems. Hypothesis discovery techniques are used to convert the sort data into intuitive web menus. PMID:17238510

  3. Pilot workload and fatigue: A critical survey of concepts and assessment techniques

    NASA Technical Reports Server (NTRS)

    Gartner, W. B.; Murphy, M. R.

    1976-01-01

    The principal unresolved issues in conceptualizing and measuring pilot workload and fatigue are discussed. These issues are seen as limiting the development of more useful working concepts and techniques and their application to systems engineering and management activities. A conceptual analysis of pilot workload and fatigue, an overview and critique of approaches to the assessment of these phenomena, and a discussion of current trends in the management of unwanted workload and fatigue effects are presented. Refinements and innovations in assessment methods are recommended for enhancing the practical significance of workload and fatigue studies.

  4. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
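
    The run-time monitoring approach mentioned above can be sketched as an invariant checker wrapped around each state update: the monitor validates the new state against declared invariants and, on violation, invokes a fault handler (here, holding the last known-good state) instead of letting a corrupted value propagate. The invariant names and limits are illustrative assumptions, not any flight project's actual checks.

```python
# Run-time monitoring sketch: check declared invariants on every state
# update and fall back to the last known-good state on a violation.
# The "attitude rate" invariant and its limit are illustrative assumptions.

class RuntimeMonitor:
    def __init__(self, invariants):
        self.invariants = invariants   # name -> predicate over the state dict
        self.last_good = None

    def update(self, state):
        """Accept the state if all invariants hold; else hold the last good one."""
        violations = [name for name, check in self.invariants.items()
                      if not check(state)]
        if violations:
            return self.last_good, violations   # fault handler: reject the update
        self.last_good = state
        return state, []

monitor = RuntimeMonitor({
    "rate_bounded": lambda s: abs(s["rate_dps"]) <= 10.0,  # deg/s limit (assumed)
    "finite": lambda s: s["rate_dps"] == s["rate_dps"],    # NaN guard
})

ok, faults = monitor.update({"rate_dps": 2.0})
bad, faults2 = monitor.update({"rate_dps": 9999.0})  # e.g. a radiation-induced upset
print(faults, faults2, bad)
```

    Pairing such monitors with fault-tolerant design (voting, safe modes) is what lets errors that escape pre-launch verification, or arise from ionizing radiation, be caught in flight.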

  5. New sensor and non-contact geometrical survey for the vibrating wire technique

    NASA Astrophysics Data System (ADS)

    Geraldes, Renan; Junqueira Leão, Rodrigo; Cernicchiaro, Geraldo; Terenzi Neuenschwander, Regis; Citadini, James Francisco; Droher Rodrigues, Antônio Ricardo

    2016-03-01

    The tolerances for the alignment of the magnets in the girders of the next machine of the Brazilian Synchrotron Light Laboratory (LNLS), Sirius, are as small as 40 μm for translations and 0.2 mrad for rotations. Therefore, a novel approach to the well-known vibrating wire technique has been developed and tested for the precise fiducialization of magnets. The alignment bench consists of four commercial linear stages, a stretched wire, a commercial lock-in amplifier working with phase-locked loop (PLL), a coordinate measuring machine (CMM) and a vibration sensor for the wire. This novel sensor has been designed for a larger linear region of operation. For the mechanical metrology step of the fiducialization of quadrupoles an innovative technique, using the vision system of the CMM, is presented. While the work with pitch and yaw orientations is still ongoing with promising partial results, the system already presents an uncertainty level below 10 μm for translational alignment.

  6. Testing river surveying techniques in tidal environments: example from an actively meandering channel surveyed with TLS (Mont Saint-Michel bay, France)

    NASA Astrophysics Data System (ADS)

    Leroux, J.; Lague, D.

    2013-12-01

    factor 2 during summer/autumn spring tides at the peak of pioneer vegetation development. Bank erosion and channel dynamics show a marked difference for tides reaching the salt marsh elevation. For tides below marsh elevation, bank erosion is negligible and the channel systematically aggrades at a rate proportional to HWL. For tides flooding the marsh, mean bank erosion increases linearly with HWL and the channel shifts to erosion for over-marsh tides. Using flow velocity and SSC data, we show that sedimentation on the inner bar results from the penetration of the turbid flood onto the inner bar. Spatial variability in sedimentation results from local interactions between flow and vegetation. In contrast, bank erosion is dominated by the very large ebb peak velocity that develops during spring tides. The strongly non-linear sensitivity of bank and channel erosion to HWL means that the rate of evolution is largely controlled by the largest tides of the year, which in turn yields very large annual fluctuations in the rates of meander evolution. These results demonstrate that mega-tidal environments can offer an alternative setting for testing new survey techniques aimed at river monitoring and can shed light on the elementary processes governing biogeomorphological interactions.

  7. Electrospinning as a powerful technique for biomedical applications: a critically selected survey.

    PubMed

    Villarreal-Gómez, Luis Jesús; Cornejo-Bravo, José Manuel; Vera-Graziano, Ricardo; Grande, Daniel

    2016-01-01

    Nowadays, electrospinning has become one of the most versatile, easy, and cost-effective techniques to engineer advanced materials used for many applications, especially in the biomedical and environmental areas. Alongside the numerous patents around the world, the increasing number of papers attests to the huge potential of this simple process, and many companies have emerged in recent years to exploit its innumerable applications. This article presents a critically selected overview of polymers that can be used to produce nanofibers, along with the biomedical applications of the resulting electrospun scaffolds. We have focused on about seven natural and synthetic polymers, but many more can be found in the literature, either in their pristine state or as composites with ceramics, metals, and other polymers. Some strategies for nanofiber production, and the characterization methods used to evaluate their optimization, are also discussed. Finally, several polymers have been recognized as highlights for future work. PMID:26540235

  8. A survey of routing techniques in store-and-forward and wormhole interconnects.

    SciTech Connect

    Holman, David Michael; Lee, David S.

    2008-01-01

    This paper presents an overview of algorithms for directing messages through networks of varying topology. These are commonly referred to as routing algorithms in the literature that is presented. In addition to providing background on networking terminology and router basics, the paper explains the issues of deadlock and livelock as they apply to routing. After this, there is a discussion of routing algorithms for both store-and-forward and wormhole-switched networks. The paper covers both algorithms that do and do not adapt to conditions in the network. Techniques targeting structured as well as irregular topologies are discussed. Following this, strategies for routing in the presence of faulty nodes and links in the network are described.
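
    A concrete instance of the deterministic, structured-topology algorithms the paper surveys is dimension-ordered (XY) routing on a 2-D mesh, which avoids deadlock by routing fully in X before turning to Y, so channel dependencies never cycle. A minimal sketch:

```python
# Dimension-ordered (XY) routing sketch for a 2-D mesh network:
# route fully in the X dimension first, then in Y. Because a message
# never turns from a Y channel back to an X channel, the channel
# dependency graph is acyclic and the routing is deadlock-free.

def xy_route(src, dst):
    """Return the list of (x, y) node hops from src to dst, inclusive."""
    x, y = src
    path = [(x, y)]
    while x != dst[0]:                 # X dimension first
        x += 1 if dst[0] > x else -1
        path.append((x, y))
    while y != dst[1]:                 # then Y
        y += 1 if dst[1] > y else -1
        path.append((x, y))
    return path

path = xy_route((0, 0), (2, 2))
print(path)  # [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
```

    XY routing is oblivious (it ignores network conditions), which is exactly the limitation the adaptive and fault-tolerant schemes discussed in the paper are designed to address.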

  9. A Survey of Measurements and Measuring Techniques in Rapidly Distorted Compressible Turbulent Boundary Layers

    NASA Technical Reports Server (NTRS)

    Fernholz, H. H.; Finley, P. J.; Dussauge, J. P.; Smits, A. J.; Reshotko, E. (Editor)

    1989-01-01

    A wide range of recent work on compressible turbulent boundary layers is described. Special attention was paid to flows with rapid changes in pressure including flows with shock waves, curved walls, and expansions. The application of rapid distortion theory to flows transversing expansion and shock waves is reviewed. This is followed by an account of experiments aimed at elucidating the large scale structures present in supersonic boundary layers. The current status of laser-Doppler and hot-wire anemometry in supersonic flow is discussed, and a new interferometric technique for the determination of wall-stress is described. The use of small pressure transducers to deduce information about the structure of zero pressure-gradient and severely perturbed boundary layers is investigated. Finally, there is an extension of the data presentation of AGARDographs 223, 253 and 263 to cover rapidly distorted boundary layers.

  10. Left atrial appendage closure-indications, techniques, and outcomes: results of the European Heart Rhythm Association Survey.

    PubMed

    Pison, Laurent; Potpara, Tatjana S; Chen, Jian; Larsen, Torben B; Bongiorni, Maria Grazia; Blomström-Lundqvist, Carina

    2015-04-01

    The purpose of this EP Wire was to assess the indications, techniques, and outcomes of left atrial appendage occlusion (LAAO) in Europe. Thirty-three European centres, all members of the European Heart Rhythm Association electrophysiology (EP) research network, responded to this survey by completing the questionnaire. The major indication for LAAO (94%) was the prevention of stroke in patients at high thrombo-embolic risk (CHA2DS2-VASc ≥ 2) and contraindications to oral anticoagulants (OACs). Twenty-one (64%) of the responding centres perform LAAO in their own institution and 80% implanted 30 or less LAAO devices in 2014. Two-dimensional transoesophageal echocardiography was the preferred imaging technique to visualize LAA before, during, and after LAAO in 79, 58, and 62% of the participating centres, respectively. Following LAAO, 49% of the centres prescribe vitamin K antagonists or novel OACs. Twenty-five per cent of the centres combine LAAO with pulmonary vein isolation. The periprocedural complications included death (range, 0-3%), ischaemic or haemorrhagic stroke (0-25%), tamponade (0-25%), and device embolization (0-20%). In conclusion, this EP Wire has demonstrated that LAAO is most commonly employed in patients at high thrombo-embolic risk in whom OAC is contraindicated. The technique is not yet very widespread and the complication rates remain significant. PMID:25833883

  11. Digital 3D Borobudur - Integration of 3D surveying and modeling techniques

    NASA Astrophysics Data System (ADS)

    Suwardhi, D.; Menna, F.; Remondino, F.; Hanke, K.; Akmalia, R.

    2015-08-01

    The Borobudur temple (Indonesia) is one of the greatest Buddhist monuments in the world, now listed as a UNESCO World Heritage Site. The present state of the temple is the result of restorations after exposure to natural disasters several times. Today there is still a growing rate of deterioration of the building stones, whose causes require further research. Monitoring programs, supported at the institutional level, have been effectively executed to observe the problem. The paper presents the latest efforts to digitally document the Borobudur Temple and its surrounding area in 3D with photogrammetric techniques. UAV and terrestrial images were acquired to completely digitize the temple and to produce DEMs, orthoimages, and maps at 1:100 and 1:1000 scale. The results of the project are now employed by the local government organizations to manage the heritage area and plan new policies for the conservation and preservation of the UNESCO site. To help data management and policy makers, a web-based information system of the heritage area was also built to visualize and easily access all the data and the achieved 3D results.

  12. Illumination Sufficiency Survey Techniques: In-situ Measurements of Lighting System Performance and a User Preference Survey for Illuminance in an Off-Grid, African Setting

    SciTech Connect

    Alstone, Peter; Jacobson, Arne; Mills, Evan

    2010-08-26

Efforts to promote rechargeable electric lighting as a replacement for fuel-based light sources in developing countries are typically predicated on the notion that lighting service levels can be maintained or improved while reducing the costs and environmental impacts of existing practices. However, the extremely low incomes of those who depend on fuel-based lighting create a need to balance the hypothetically possible or desirable levels of light with those that are sufficient and affordable. In a pilot study of four night vendors in Kenya, we document a field technique we developed to simultaneously measure the effectiveness of lighting service provided by a lighting system and conduct a survey of lighting service demand by end-users. We took gridded illuminance measurements across each vendor's working and selling area, with users indicating the sufficiency of light at each point. User light sources included a mix of kerosene-fueled hurricane lanterns, pressure lamps, and LED lanterns. We observed illuminance levels ranging from just above zero to 150 lux. The LED systems markedly improved the lighting service levels over those provided by kerosene-fueled hurricane lanterns. Users reported that the minimum acceptable threshold was about 2 lux. The results also indicated that the LED lamps in use by the subjects did not always provide sufficient illumination over the desired retail areas. Our sample size is much too small, however, to reach any conclusions about requirements in the broader population. Given the small number of subjects and very specific type of user, our results should be regarded as indicative rather than conclusive. We recommend replicating the method at larger scales and across a variety of user types and contexts. Policymakers should revisit the subject of recommended illuminance levels regularly as LED technology advances and the price/service balance point evolves.

  13. A study of methods to predict and measure the transmission of sound through the walls of light aircraft. A survey of techniques for visualization of noise fields

    NASA Technical Reports Server (NTRS)

    Marshall, S. E.; Bernhard, R.

    1984-01-01

    A survey of the most widely used methods for visualizing acoustic phenomena is presented. Emphasis is placed on acoustic processes in the audible frequencies. Many visual problems are analyzed on computer graphic systems. A brief description of the current technology in computer graphics is included. The visualization technique survey will serve as basis for recommending an optimum scheme for displaying acoustic fields on computer graphic systems.

  14. Recent calving dynamics of Glaciar Jorge Montt (Southern Patagonia Icefield) based on feature tracking techniques and oceanographic surveys

    NASA Astrophysics Data System (ADS)

    Bown, F.; Moffat, C. F.; Rivera, A.; Cisternas, S.; Kohoutek, T.

    2013-12-01

Glaciers in the Southern Patagonia Icefield (SPI) have been retreating, thinning, and accelerating in recent decades. Most of the SPI is composed of temperate ice, so melting is the dominant wasting factor; however, calving also plays a very important role, especially because it enhances the dynamic response of the ice, particularly when glaciers calve into deep water. Some of the most pronounced responses are connected to the well-documented, long-term tidewater calving cycle (TCC), overlapped by recent climate-related glacier responses. Glaciar Jorge Montt (48S/73W) is a tidewater glacier (~500 km2) that has experienced the largest frontal retreat in the whole SPI (nearly 20 km in 112 years) while retreating into water up to 400 m deep. Dead trees found in areas recently exposed by the glacier's retreat provide a date for the previous advancing cycle, which took place during the Little Ice Age (250-400 years BP). This result indicates that the glacier is experiencing the retreating phase of the TCC on centennial time-scales. However, little is known about whether this phase will stop or continue, or how climate change will affect it. To understand the present behaviour of the glacier, several surveys have recently been conducted in the area, including airborne lidar and radar surveys, water depth measurements, and ice dynamics studies. To survey the ice dynamics of the glacier front in connection with tides at the inner fjord, a camera pointing at the glacier terminus and collecting up to 8 photographs per day was installed in April 2012. The camera operated continuously for 60 days, allowing detailed study of ice velocities, calving fluxes, and tides near the ice. Thanks to the geo-location of the oblique photographs, feature tracking techniques were applied to the image series to determine ice velocities and frontal retreat during the operational period. The resulting average velocities are lower than 10 m d-1, which are
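
    The feature-tracking step in this abstract amounts to matching surface features between successive photographs and reading off their displacement; a minimal sketch, using a synthetic texture and a known shift rather than the study's imagery:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Two synthetic "photographs": a random texture shifted by a known
    # displacement between frames, standing in for glacier-surface features.
    scene = rng.normal(size=(120, 120))
    dy_true, dx_true = 5, 7
    frame0 = scene[10:90, 10:90]
    frame1 = scene[10 + dy_true:90 + dy_true, 10 + dx_true:90 + dx_true]

    # Track one 20x20 feature from frame0 by exhaustive normalized
    # cross-correlation against frame1; the best-scoring offset gives the
    # apparent displacement (velocity = displacement / time between frames).
    tpl = frame0[30:50, 30:50]
    tpl = (tpl - tpl.mean()) / tpl.std()
    best_score, shift = -np.inf, None
    for dy in range(frame1.shape[0] - 20):
        for dx in range(frame1.shape[1] - 20):
            win = frame1[dy:dy + 20, dx:dx + 20]
            win = (win - win.mean()) / (win.std() + 1e-12)
            score = float((tpl * win).mean())
            if score > best_score:
                best_score, shift = score, (dy - 30, dx - 30)
    # The feature appears shifted by (-dy_true, -dx_true) in frame coordinates
    # because the camera window moved by (+dy_true, +dx_true) over the scene.
    ```

    Real implementations use sub-pixel interpolation and orthorectification of the oblique photographs before converting pixel displacements to metres per day.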

  15. An analysis of I/O efficient order-statistic-based techniques for noise power estimation in the HRMS sky survey's operational system

    NASA Technical Reports Server (NTRS)

    Zimmerman, G. A.; Olsen, E. T.

    1992-01-01

Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. If the dynamic range of the value to be estimated is limited, the performance of an order-statistic estimator can be matched by simpler techniques requiring only a single pass over the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
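
    The threshold-and-count idea can be sketched in a few lines, assuming (as in a noise-only spectrometer bin) exponentially distributed power samples; the threshold and sample count below are illustrative, not HRMS system values:

    ```python
    import math
    import random

    def threshold_and_count_power(samples, threshold):
        """Single-pass noise power estimate from the fraction of samples
        exceeding a fixed threshold.  For exponentially distributed power
        samples (chi-squared, 2 DOF), P(x > T) = exp(-T / mu), so
        mu = T / (-ln(fraction above T))."""
        frac = sum(1 for x in samples if x > threshold) / len(samples)
        if not 0.0 < frac < 1.0:
            raise ValueError("threshold outside the usable dynamic range")
        return threshold / -math.log(frac)

    def order_statistic_power(samples, quantile=0.5):
        """Reference estimator: invert the exponential CDF at an order
        statistic (the median by default): x_q = -mu * ln(1 - q)."""
        x_q = sorted(samples)[int(quantile * len(samples))]
        return x_q / -math.log(1.0 - quantile)

    # Simulated noise-power samples with true mean power mu = 3.0
    random.seed(42)
    true_mu = 3.0
    samples = [random.expovariate(1.0 / true_mu) for _ in range(100_000)]
    tc_estimate = threshold_and_count_power(samples, threshold=true_mu)  # close to 3.0
    os_estimate = order_statistic_power(samples)                         # close to 3.0
    ```

    The count-based estimator never sorts or revisits the data, which is the I/O advantage the abstract refers to; its usable dynamic range is set by where the counted fraction stays away from 0 and 1.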

  16. The VIMOS Public Extragalactic Redshift Survey (VIPERS). Never mind the gaps: comparing techniques to restore homogeneous sky coverage

    NASA Astrophysics Data System (ADS)

    Cucciati, O.; Granett, B. R.; Branchini, E.; Marulli, F.; Iovino, A.; Moscardini, L.; Bel, J.; Cappi, A.; Peacock, J. A.; de la Torre, S.; Bolzonella, M.; Guzzo, L.; Polletta, M.; Fritz, A.; Adami, C.; Bottini, D.; Coupon, J.; Davidzon, I.; Franzetti, P.; Fumana, M.; Garilli, B.; Krywult, J.; Małek, K.; Paioro, L.; Pollo, A.; Scodeggio, M.; Tasca, L. A. M.; Vergani, D.; Zanichelli, A.; Di Porto, C.; Zamorani, G.

    2014-05-01

Aims: Non-uniform sampling and gaps in sky coverage are common in galaxy redshift surveys, but these effects can degrade galaxy counts-in-cells measurements and density estimates. We carry out a comparative study of methods that aim to fill the gaps to correct for these systematic effects. Our study is motivated by the analysis of the VIMOS Public Extragalactic Redshift Survey (VIPERS), a flux-limited survey at iAB < 22.5 consisting of single-pass observations with the VLT Visible Multi-Object Spectrograph (VIMOS), with gaps representing 25% of the surveyed area and an average sampling rate of 35%. However, our findings are generally applicable to other redshift surveys with similar observing strategies. Methods: We applied two algorithms that use photometric redshift information and assign redshifts to galaxies based upon the spectroscopic redshifts of the nearest neighbours. We compared these methods with two Bayesian methods, the Wiener filter and the Poisson-Lognormal filter. Using galaxy mock catalogues, we quantified the accuracy and precision of the counts-in-cells measurements on scales of R = 5 h-1 Mpc and 8 h-1 Mpc after applying each of these methods. We further investigated how well these methods account for other sources of uncertainty typical of spectroscopic surveys, such as the spectroscopic redshift error and the sparse, inhomogeneous sampling rate. We analysed each of these sources separately, then all together in a mock catalogue that mimics the full observational strategy of a VIPERS-like survey. Results: In a survey such as VIPERS, the errors in counts-in-cells measurements on R < 10 h-1 Mpc scales are dominated by the sparseness of the sample due to the single-pass observing strategy. All methods under-predict the counts in high-density regions by 20-35%, depending on the cell size, method, and underlying overdensity. This systematic bias is comparable in size to the random errors. No method outperforms the others: differences are not large, and methods

  17. Neutral Hydrogen Structures Trace Dust Polarization Angle: Implications for Cosmic Microwave Background Foregrounds

    NASA Astrophysics Data System (ADS)

    Clark, S. E.; Hill, J. Colin; Peek, J. E. G.; Putman, M. E.; Babler, B. L.

    2015-12-01

    Using high-resolution data from the Galactic Arecibo L-Band Feed Array HI (GALFA-Hi) survey, we show that linear structure in Galactic neutral hydrogen (Hi) correlates with the magnetic field orientation implied by Planck 353 GHz polarized dust emission. The structure of the neutral interstellar medium is more tightly coupled to the magnetic field than previously known. At high Galactic latitudes, where the Planck data are noise dominated, the Hi data provide an independent constraint on the Galactic magnetic field orientation, and hence the local dust polarization angle. We detect strong cross-correlations between template maps constructed from estimates of dust intensity combined with either Hi-derived angles, starlight polarization angles, or Planck 353 GHz angles. The Hi data thus provide a new tool in the search for inflationary gravitational wave B -mode polarization in the cosmic microwave background, which is currently limited by dust foreground contamination.
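
    Cross-correlating orientation-angle maps, as done here between HI-derived and Planck 353 GHz angles, must respect the 180-degree ambiguity of polarization angles; a minimal sketch of the standard mean-of-cos(2 delta-theta) alignment statistic on synthetic maps (not the GALFA-Hi or Planck data):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic orientation-angle maps (radians, defined modulo pi, as for
    # dust polarization angles or HI fiber orientations); the second map is
    # the first plus small angular noise.
    theta_a = rng.uniform(0.0, np.pi, size=(64, 64))
    theta_b = theta_a + rng.normal(scale=0.2, size=theta_a.shape)

    # An orientation and its 180-degree flip are the same, so agreement is
    # measured with <cos 2*dtheta>: 1 = aligned, 0 = uncorrelated,
    # -1 = perpendicular.
    alignment = float(np.mean(np.cos(2.0 * (theta_a - theta_b))))
    ```

    For Gaussian angular noise of standard deviation sigma, this statistic averages to roughly exp(-2 sigma^2), so values near 1 indicate tight alignment.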

  18. Neutral Hydrogen Structures Trace Dust Polarization Angle: Implications for Cosmic Microwave Background Foregrounds.

    PubMed

    Clark, S E; Hill, J Colin; Peek, J E G; Putman, M E; Babler, B L

    2015-12-11

    Using high-resolution data from the Galactic Arecibo L-Band Feed Array HI (GALFA-Hi) survey, we show that linear structure in Galactic neutral hydrogen (Hi) correlates with the magnetic field orientation implied by Planck 353 GHz polarized dust emission. The structure of the neutral interstellar medium is more tightly coupled to the magnetic field than previously known. At high Galactic latitudes, where the Planck data are noise dominated, the Hi data provide an independent constraint on the Galactic magnetic field orientation, and hence the local dust polarization angle. We detect strong cross-correlations between template maps constructed from estimates of dust intensity combined with either Hi-derived angles, starlight polarization angles, or Planck 353 GHz angles. The Hi data thus provide a new tool in the search for inflationary gravitational wave B-mode polarization in the cosmic microwave background, which is currently limited by dust foreground contamination. PMID:26705622

  19. Survey of WBSNs for Pre-Hospital Assistance: Trends to Maximize the Network Lifetime and Video Transmission Techniques

    PubMed Central

    Gonzalez, Enrique; Peña, Raul; Vargas-Rosales, Cesar; Avila, Alfonso; Perez-Diaz de Cerio, David

    2015-01-01

This survey aims to encourage the multidisciplinary communities to join forces for innovation in the mobile health monitoring area. Specifically, multidisciplinary innovations in medical emergency scenarios can have a significant impact on the effectiveness and quality of the procedures and practices in the delivery of medical care. Wireless body sensor networks (WBSNs) are a promising technology capable of improving the existing practices in condition assessment and care delivery for a patient in a medical emergency. This technology can also facilitate the early interventions of a specialist physician during the pre-hospital period. WBSNs make possible these early interventions by establishing remote communication links with video/audio support and by providing medical information such as vital signs, electrocardiograms, etc. in real time. This survey focuses on the relevant issues needed to understand how to set up a WBSN for medical emergencies. These issues are: vital-sign monitoring and video transmission, energy-efficient protocols, scheduling, optimization, and energy consumption in a WBSN. PMID:26007741

  20. Survey of WBSNs for Pre-Hospital Assistance: Trends to Maximize the Network Lifetime and Video Transmission Techniques.

    PubMed

    Gonzalez, Enrique; Peña, Raul; Vargas-Rosales, Cesar; Avila, Alfonso; de Cerio, David Perez-Diaz

    2015-01-01

This survey aims to encourage the multidisciplinary communities to join forces for innovation in the mobile health monitoring area. Specifically, multidisciplinary innovations in medical emergency scenarios can have a significant impact on the effectiveness and quality of the procedures and practices in the delivery of medical care. Wireless body sensor networks (WBSNs) are a promising technology capable of improving the existing practices in condition assessment and care delivery for a patient in a medical emergency. This technology can also facilitate the early interventions of a specialist physician during the pre-hospital period. WBSNs make possible these early interventions by establishing remote communication links with video/audio support and by providing medical information such as vital signs, electrocardiograms, etc. in real time. This survey focuses on the relevant issues needed to understand how to set up a WBSN for medical emergencies. These issues are: vital-sign monitoring and video transmission, energy-efficient protocols, scheduling, optimization, and energy consumption in a WBSN. PMID:26007741

  1. Amazonas project: Application of remote sensing techniques for the integrated survey of natural resources in Amazonas. [Brazil

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator)

    1981-01-01

The use of LANDSAT multispectral scanner and return beam vidicon imagery for surveying the natural resources of the Brazilian Amazonas is described. The purposes of the Amazonas development project are summarized. The application of LANDSAT imagery to the identification of vegetation cover and land use, the identification of soil types, geomorphology and geology, and highway planning is discussed. An evaluation of the worth of LANDSAT imagery in mapping the region is presented. Maps generated by the project are included.

  2. Longitudinal emittance: An introduction to the concept and survey of measurement techniques including design of a wall current monitor

    SciTech Connect

    Webber, R.C.

    1990-03-01

The properties of charged particle beams associated with the distribution of the particles in energy and in time can be grouped together under the category of longitudinal emittance. This article is intended to provide an intuitive introduction to the concepts of longitudinal emittance; to provide an incomplete survey of methods used to measure this emittance and the related properties of bunch length and momentum spread; and to describe the detailed design of a 6 GHz bandwidth resistive wall current monitor useful for measuring bunch shapes of moderate- to high-intensity beams. Overall, the article is intended to be broad in scope, in most cases deferring details to the cited original papers. 37 refs., 21 figs.

  3. Applications of the five-hole probe technique for flow field surveys at the Institute for Aerospace Research

    NASA Astrophysics Data System (ADS)

    Ohman, L. H.; Nguyen, V. D.

    1994-07-01

This paper deals with the calibration and use of five-hole probes for flow field surveys. Two applications are given: one in the transonic regime in the near slipstream of a powered propfan mounted on a half-model wing configuration, and the other behind a generic submarine model at subsonic speeds. The acquired data have been analyzed in terms of flow angles, total and dynamic pressures, Mach number, and velocity vector in a probe-fixed coordinate system. These parameters were necessary for determining the flow field characteristics of the studied configurations, which are presented and discussed.

  4. Survey of Human Operator Modeling Techniques for Measurement Applications. Final Report for Period April 1976-December 1977.

    ERIC Educational Resources Information Center

    Knoop, Patricia A.

    The purpose of this study was to review existing human operator modeling techniques and evaluate their potential utility for performance measurement applications (e.g., to support the type of flight simulation research that entails accounting for the perception and utilization of various cues). The major human operator characteristics that ought…

  5. Web-based, mobile-device friendly, self-report survey system incorporating avatars and gaming console techniques

    PubMed Central

    Savel, Craig; Mierzwa, Stan; Gorbach, Pamina; Lally, Michelle; Zimet, Gregory; Meyer, Kristin; Souidi, Samir; Interventions, AIDS

    2014-01-01

    We describe building an avatar-based self-report data collection tool to be used for a specific HIV prevention research project that is evaluating the feasibility and acceptability of this novel approach to collect self-reported data among youth. We discuss the gathering of requirements, the process of building a prototype of the envisioned system, and the lessons learned during the development of the solution. Specific knowledge is shared regarding technical experience with software development technologies and possible avenues for changes that could be considered if such a self-report survey system is used again. Examples of other gaming and avatar technology systems are included to provide further background. PMID:25422726

  6. Web-based, mobile-device friendly, self-report survey system incorporating avatars and gaming console techniques.

    PubMed

    Savel, Craig; Mierzwa, Stan; Gorbach, Pamina; Lally, Michelle; Zimet, Gregory; Meyer, Kristin; Souidi, Samir; Interventions, Aids

    2014-01-01

    We describe building an avatar-based self-report data collection tool to be used for a specific HIV prevention research project that is evaluating the feasibility and acceptability of this novel approach to collect self-reported data among youth. We discuss the gathering of requirements, the process of building a prototype of the envisioned system, and the lessons learned during the development of the solution. Specific knowledge is shared regarding technical experience with software development technologies and possible avenues for changes that could be considered if such a self-report survey system is used again. Examples of other gaming and avatar technology systems are included to provide further background. PMID:25422726

  7. Peering through the OH forest: a new technique to remove residual sky features from Sloan Digital Sky Survey spectra

    NASA Astrophysics Data System (ADS)

    Wild, Vivienne; Hewett, Paul C.

    2005-04-01

    The Sloan Digital Sky Survey (SDSS) currently provides by far the largest homogeneous sample of intermediate signal-to-noise (S/N) ratio optical spectra of galaxies and quasars. The fully automated SDSS spectroscopic reduction pipeline has provided spectra of unprecedented quality that cover the wavelength range 3800-9200Å. However, in common with spectra from virtually all multi-object surveys employing fibres, there remain significant systematic residuals in many of the spectra owing to the incomplete subtraction of the strong OH sky emission lines longward of 6700Å. These sky lines affect almost half the wavelength range of the SDSS spectra, and the S/N ratio over substantial wavelength regions in many spectra is reduced by more than a factor of 2 over that expected from counting statistics. We present a method to automatically remove the sky-residual signal, using a principal component analysis which takes advantage of the correlation in the form of the sky-subtraction residuals present in each spectrum. Application of the method results in spectra with essentially no evidence for degradation owing to the incomplete subtraction of OH emission features. A dramatic improvement in the quality of a substantial number of spectra, particularly those of faint objects such as the bulk of the high-redshift quasars, is achieved. We make available Interactive Data Language (IDL) code and documentation to implement the sky-residual subtraction scheme on SDSS spectra included in the public data releases. To ensure that absorption and emission features intrinsic to the object spectra do not affect the subtraction procedure, line masks must be created that depend on the scientific application of interest. We illustrate the power of the sky-residual subtraction scheme using samples of SDSS galaxy and quasar spectra, presenting tests involving the near-infrared CaII triplet absorption, metal absorption line features in damped Lyman-α systems and composite spectra of high
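
    A minimal sketch of the PCA-based sky-residual subtraction this abstract describes, on synthetic spectra; the mode shapes, line position, and mask width below are stand-ins, not the SDSS pipeline's values:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in: n_sky blank-sky spectra share a few correlated
    # OH-residual modes; an object spectrum (a single Gaussian emission
    # line) is contaminated by the same modes.
    n_pix, n_sky, n_modes = 400, 200, 3
    modes = rng.normal(size=(n_modes, n_pix))
    sky = rng.normal(size=(n_sky, n_modes)) @ modes \
        + 0.05 * rng.normal(size=(n_sky, n_pix))

    # Principal components of the sky residuals via SVD of the
    # mean-subtracted sky spectra.
    _, _, vt = np.linalg.svd(sky - sky.mean(axis=0), full_matrices=False)
    pcs = vt[:n_modes]

    pix = np.arange(n_pix)
    signal = np.exp(-0.5 * ((pix - 180) / 5.0) ** 2)   # intrinsic line
    residual = rng.normal(size=n_modes) @ modes        # sky contamination
    spectrum = signal + residual

    # Mask pixels around the intrinsic feature so it does not bias the fit,
    # then fit the components on unmasked pixels and subtract everywhere.
    mask = np.abs(pix - 180) > 15
    coeffs, *_ = np.linalg.lstsq(pcs[:, mask].T, spectrum[mask], rcond=None)
    cleaned = spectrum - coeffs @ pcs
    ```

    The mask plays the role of the paper's science-dependent line masks: without it, the fit would partially absorb the object's own emission feature.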

  8. Age Determination by Back Length for African Savanna Elephants: Extending Age Assessment Techniques for Aerial-Based Surveys

    PubMed Central

    Trimble, Morgan J.; van Aarde, Rudi J.; Ferreira, Sam M.; Nørgaard, Camilla F.; Fourie, Johan; Lee, Phyllis C.; Moss, Cynthia J.

    2011-01-01

    Determining the age of individuals in a population can lead to a better understanding of population dynamics through age structure analysis and estimation of age-specific fecundity and survival rates. Shoulder height has been used to accurately assign age to free-ranging African savanna elephants. However, back length may provide an analog measurable in aerial-based surveys. We assessed the relationship between back length and age for known-age elephants in Amboseli National Park, Kenya, and Addo Elephant National Park, South Africa. We also compared age- and sex-specific back lengths between these populations and compared adult female back lengths across 11 widely dispersed populations in five African countries. Sex-specific Von Bertalanffy growth curves provided a good fit to the back length data of known-age individuals. Based on back length, accurate ages could be assigned relatively precisely for females up to 23 years of age and males up to 17. The female back length curve allowed more precise age assignment to older females than the curve for shoulder height does, probably because of divergence between the respective growth curves. However, this did not appear to be the case for males, but the sample of known-age males was limited to ≤27 years. Age- and sex-specific back lengths were similar in Amboseli National Park and Addo Elephant National Park. Furthermore, while adult female back lengths in the three Zambian populations were generally shorter than in other populations, back lengths in the remaining eight populations did not differ significantly, in support of claims that growth patterns of African savanna elephants are similar over wide geographic regions. Thus, the growth curves presented here should allow researchers to use aerial-based surveys to assign ages to elephants with greater precision than previously possible and, therefore, to estimate population variables. PMID:22028925
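
    A sketch of how a fitted Von Bertalanffy curve is inverted to assign an age to a measured back length; the growth parameters below are hypothetical placeholders, not the values fitted in the paper:

    ```python
    import math

    def von_bertalanffy(age, l_inf, k, t0):
        """Von Bertalanffy growth curve: length approaches the asymptote
        l_inf at rate k; t0 is the (extrapolated) age of zero length."""
        return l_inf * (1.0 - math.exp(-k * (age - t0)))

    def age_from_length(length, l_inf, k, t0):
        """Invert the growth curve to assign an age to a measured back
        length; only defined below the asymptotic size l_inf."""
        if not 0.0 <= length < l_inf:
            raise ValueError("length must lie in [0, l_inf)")
        return t0 - math.log(1.0 - length / l_inf) / k

    # Hypothetical female parameters for illustration only (back length
    # in cm); these are not the study's fitted values.
    L_INF, K, T0 = 150.0, 0.09, -4.0
    length_at_10 = von_bertalanffy(10.0, L_INF, K, T0)
    recovered_age = age_from_length(length_at_10, L_INF, K, T0)  # recovers 10.0
    ```

    The flattening of the curve near the asymptote is why age assignment loses precision for older animals: a small measurement error in length maps to a large error in age.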

  9. Application of capillary gas chromatography mass spectrometry/computer techniques to synoptic survey of organic material in bed sediment

    USGS Publications Warehouse

    Steinheimer, T.R.; Pereira, W.E.; Johnson, S.M.

    1981-01-01

A bed sediment sample taken from an area impacted by heavy industrial activity was analyzed for organic compounds of environmental significance. Extraction was effected on a Soxhlet apparatus using a freeze-dried sample. The Soxhlet extract was fractionated by silica gel micro-column adsorption chromatography. Separation and identification of the organic compounds were accomplished by capillary gas chromatography/mass spectrometry techniques. More than 50 compounds were identified; these include saturated hydrocarbons, olefins, aromatic hydrocarbons, alkylated polycyclic aromatic hydrocarbons, and oxygenated compounds such as aldehydes and ketones. The role of bed sediments as a source or sink for organic pollutants is discussed. © 1981.

  10. Prioritizing the Compensation Mechanisms for Nurses Working in Emergency Department of Hospital Using Fuzzy DEMATEL Technique: A Survey from Iran

    PubMed Central

    Mamikhani, Jahanara; Tofighi, Shahram; Sadeghifar, Jamil; Heydari, Majied; Jenab, Vahied Hosseini

    2014-01-01

Aim and Background: Nursing professionals are the most important human resources providing care in hospital Emergency Departments (EDs). Appropriate compensation for the services they provide is therefore a priority. This study aims to identify and prioritize the factors affecting compensation for services provided by ED nurses. Methods: Twenty-four nurses, hospital administrators, and local and national health authorities participated in a cross-sectional study conducted in 2012. The participants discussed different compensation mechanisms for ED nurses in six groups using the focus group discussion (FGD) technique, which yielded the adopted mechanisms. The participants' opinions on the mechanisms were obtained via paired matrices using fuzzy logic, and the Decision Making Trial and Evaluation Laboratory (DEMATEL) technique was used to prioritize the adopted mechanisms. Findings: Among the compensation mechanisms for ED nursing services, monthly fixed amounts (9.0382) and increasing the number of vacation days (9.0189) had the highest importance; the lowest importance was given to performance-based payment (8.9897). Monthly fixed amounts, increasing the number of vacation days, decreasing working hours, and performance-based payment were recognized as effective factors. The remaining mechanisms were prioritized as follows: use of facilities, increase in emergency tariff, job promotions, non-cash payments, continuing education, and persuasive years. Conclusion: According to the results, nurses working in the EDs of the hospitals were more inclined to receive non-cash benefits than cash benefits as compensation. PMID:24576368
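
    The DEMATEL step used to separate "cause" from "effect" factors can be sketched as follows; the direct-influence matrix is hypothetical and only illustrates the computation, not the study's expert judgements (which were additionally fuzzified):

    ```python
    import numpy as np

    # Hypothetical 4x4 direct-influence matrix among four compensation
    # mechanisms (0 = no influence ... 4 = very high); illustrative only.
    A = np.array([
        [0, 3, 2, 1],
        [1, 0, 3, 2],
        [2, 1, 0, 3],
        [1, 2, 1, 0],
    ], dtype=float)

    # DEMATEL: normalize by the largest row sum, then form the
    # total-relation matrix T = D (I - D)^-1, which accumulates direct and
    # all indirect influence paths (the Neumann series D + D^2 + D^3 + ...).
    D = A / A.sum(axis=1).max()
    T = D @ np.linalg.inv(np.eye(A.shape[0]) - D)

    r = T.sum(axis=1)      # influence given by each mechanism
    c = T.sum(axis=0)      # influence received by each mechanism
    prominence = r + c     # overall importance, used for ranking
    relation = r - c       # > 0: net cause, < 0: net effect
    ```

    The prominence values (r + c) play the role of the importance scores reported in the Findings, while the sign of r - c distinguishes the "effective" (causal) mechanisms.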

  11. Directional-cosine and related pre-processing techniques - Possibilities and problems in earth-resources surveys

    NASA Technical Reports Server (NTRS)

    Quiel, F.

    1975-01-01

    The possibilities of using various pre-processing techniques (directional-cosine, ratios and ratio/sum) have been investigated in relation to an urban land-use problem in Marion County, Indiana (USA) and for geologic applications in the San Juan Mountains of Colorado. For Marion County, it proved possible to classify directional-cosine data from September 1972 into different land uses by applying statistics developed with data from a May 1973 ERTS frame, thereby demonstrating the possibilities of using this type of data for signature-extension purposes. In the Silverton (Colorado) area pre-processed data proved superior to original data when extracting useful information in mountainous areas without corresponding ground observations. This approach allowed meaningful classification and interpretation of the data. The main problems encountered as a result of atmospheric effects, mixing of different surface materials, and the performance characteristics of ERTS are elucidated.

  12. Direct, immunological and molecular techniques for a fasciolosis survey in a rural area of San Luis, Argentina.

    PubMed

    Carnevale, Silvana; Cabrera, Marta Graciela; Cucher, Marcela Alejandra; di Risio, Cecilia Alicia; Malandrini, Jorge Bruno; Kamenetzky, Laura; Alazraqui, Marcio; Etchart, Cristina Beatriz; Pantano, María Laura; Velásquez, Jorge Néstor

    2013-10-01

Fasciolosis is a zoonosis caused by the trematode Fasciola hepatica, prevalent in cattle, which is now emerging as a cause of disease in humans. The goal of this work was to describe the characteristics of fasciolosis in the arroyo El Juncal region, La Toma, San Luis province, Argentina. To achieve this objective, a cross-sectional, quantitative study was carried out through fieldwork that allowed the collection of data and of human, animal, and environmental samples. The materials were processed by direct, immunological, and/or molecular diagnostic techniques. Given the geographical characteristics and the presence of all the definitive and intermediate hosts, reservoirs, and sources of infection, it was possible to describe the persistence of fasciolosis in the area. The prevalence was 11.90% in humans (by serology), 5.26% in cattle (by coprological analysis), and 61.76% in snails (by PCR). The situation found in this area indicates that any intervention for the control of this zoonosis should be undertaken by multidisciplinary teams. PMID:24431579

  13. Behavioral Risk Profile of Men Who Have Sex with Men in Beijing, China: Results from a Cross-sectional Survey with Randomized Response Techniques

    PubMed Central

    Geng, Guo-Zhu; Gao, Ge; Ruan, Yu-Hua; Yu, Ming-Run; Zhou, Yun-Hua

    2016-01-01

    Background: Human immunodeficiency virus (HIV) is spreading rapidly among men who have sex with men (MSM) in China. Anonymous questionnaires or direct interviews have been frequently used to study their behavior. The aim of the study was to describe the behavioral risk profile of the MSM in Beijing using the randomized response techniques (RRTs). Methods: A cross-sectional survey of sexual behavior among a sample of MSM was conducted in two HIV counseling and testing clinics in Beijing. The survey was carried out with an anonymous questionnaire containing sensitive questions on sexual behavior. To obtain the honest responses to the sensitive questions, three distinctive RRTs were used in the questionnaire: (1) Additive randomized response model for quantitative questions, (2) randomized response model for multiple choice questions, and (3) Simmons randomized response model for binomial questions. Formulae for the point estimate, variance, and confidence interval (CI) were provided for each specific model. Results: Using RRTs in a sample of 659 participants, the mean age at first homosexual encounter was estimated to be 21.7 years (95% CI: 21.2–22.2), and each had sex with about three (2.9, 95% CI: 2.4–3.4) male partners on average in the past month. The estimated rate for consistent condom use was 56.4% (95% CI: 50.1–62.8%). In addition, condom was estimated to be used among 80.0% (95% CI: 74.1–85.9%) of the population during last anal sex with a male partner. Conclusions: Our study employed RRTs in a survey containing questions on sexual behavior among MSM, and the results showed that RRT might be a useful tool to obtain truthful feedback on sensitive information such as sexual behavior from the respondents, especially in traditional Chinese cultural settings. PMID:26904985
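
    A sketch of the Simmons (unrelated-question) randomized response model mentioned here, with its point estimate and a normal-approximation confidence interval, applied to simulated responses; the design probabilities and prevalences are illustrative, not the study's:

    ```python
    import math
    import random

    def simmons_estimate(yes_count, n, p_sensitive, pi_unrelated, z=1.96):
        """Simmons unrelated-question randomized response model.

        Each respondent privately answers the sensitive question with
        probability p_sensitive, otherwise an innocuous question with known
        prevalence pi_unrelated.  The observed yes-rate is
            lam = p_sensitive * pi_s + (1 - p_sensitive) * pi_unrelated,
        which is inverted for the sensitive prevalence pi_s."""
        lam = yes_count / n
        pi_s = (lam - (1.0 - p_sensitive) * pi_unrelated) / p_sensitive
        half = z * math.sqrt(lam * (1.0 - lam) / n) / p_sensitive
        return pi_s, (pi_s - half, pi_s + half)

    # Simulated survey: true sensitive prevalence 0.30, randomizing device
    # selects the sensitive question with probability 0.7, innocuous
    # question has known prevalence 0.5.
    random.seed(1)
    n, p_dev, pi_u, pi_true = 5000, 0.7, 0.5, 0.30
    yes = sum(
        (random.random() < pi_true) if random.random() < p_dev
        else (random.random() < pi_u)
        for _ in range(n)
    )
    est, (ci_lo, ci_hi) = simmons_estimate(yes, n, p_dev, pi_u)
    ```

    Privacy comes from the interviewer never knowing which question a respondent answered; the price is the inflated variance (the 1/p_sensitive factor in the interval half-width).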

  14. Where we stand, where we are moving: Surveying computational techniques for identifying miRNA genes and uncovering their regulatory role.

    PubMed

    Kleftogiannis, Dimitrios; Korfiati, Aigli; Theofilatos, Konstantinos; Likothanassis, Spiros; Tsakalidis, Athanasios; Mavroudi, Seferina

    2013-06-01

Traditional biology was forced to restate some of its principles when the microRNA (miRNA) genes and their regulatory role were first discovered. Typically, miRNAs are small non-coding RNA molecules which have the ability to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. Existing experimental techniques for their identification and the prediction of the target genes share some important limitations such as low coverage, time-consuming experiments, and high-cost reagents. Hence, many computational methods have been proposed for these tasks to overcome these limitations. Recently, many researchers have emphasized the development of computational approaches to predict the participation of miRNA genes in regulatory networks and to analyze their transcription mechanisms. All these approaches have certain advantages and disadvantages, which are described in the present survey. Our work is differentiated from existing review papers by updating the list of methodologies and emphasizing the computational issues that arise from miRNA data analysis. Furthermore, in the present survey, the various miRNA data analysis steps are treated as an integrated procedure whose aim is to uncover the regulatory role and mechanisms of the miRNA genes. This integrated view of the miRNA data analysis steps may be extremely useful for all researchers, even if they work on just a single step. PMID:23501016

  15. Predator Presence and Vegetation Density Affect Capture Rates and Detectability of Litoria aurea Tadpoles: Wide-Ranging Implications for a Common Survey Technique

    PubMed Central

    Sanders, Madeleine R.; Clulow, Simon; Bower, Deborah S.; Clulow, John; Mahony, Michael J.

    2015-01-01

    Trapping is a common sampling technique used to estimate fundamental population metrics of animal species such as abundance, survival and distribution. However, capture success for any trapping method can be heavily influenced by individuals’ behavioural plasticity, which in turn affects the accuracy of any population estimates derived from the data. Funnel trapping is one of the most common methods for sampling aquatic vertebrates, although, apart from fish studies, almost nothing is known about the effects of behavioural plasticity on trapping success. We used a full factorial experiment to investigate the effects that two common environmental parameters (predator presence and vegetation density) have on the trapping success of tadpoles. Using odds ratios based on fitted model means, we estimated that the odds of tadpoles being captured in traps were 4.3 times higher when predators were absent than when they were present, and 2.1 times higher when vegetation density was high rather than low. The odds of tadpoles being detected in traps were also 2.9 times higher in predator-free environments. These results indicate that common environmental factors can trigger behavioural plasticity in tadpoles that biases trapping success. We warn researchers and surveyors that such trapping biases may be commonplace, and urge caution in interpreting data without consideration of important environmental factors present in the study system. Left unconsidered, trapping biases in capture success have the potential to lead to incorrect interpretations of data sets and misdirection of limited resources for managing species. PMID:26605923
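The odds ratios quoted above can be illustrated with a minimal 2x2-table calculation. Note that the paper derived its ratios from fitted model means, so the counts below are purely hypothetical and the simple Wald interval is only a sketch of the idea.

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table with cells
    a = captured / predator absent,  b = not captured / predator absent,
    c = captured / predator present, d = not captured / predator present."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Hypothetical counts: 43 of 60 tadpoles captured with predators absent,
# 15 of 60 captured with predators present
or_, (lo, hi) = odds_ratio(43, 17, 15, 45)
```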

  16. Dutch Young Adults Ratings of Behavior Change Techniques Applied in Mobile Phone Apps to Promote Physical Activity: A Cross-Sectional Survey

    PubMed Central

    Belmon, Laura S; te Velde, Saskia J; Brug, Johannes

    2015-01-01

    Background Interventions delivered through new device technology, including mobile phone apps, appear to be an effective method to reach young adults. Previous research indicates that self-efficacy and social support for physical activity and self-regulation behavior change techniques (BCT), such as goal setting, feedback, and self-monitoring, are important for promoting physical activity; however, little is known about evaluations by the target population of BCTs applied to physical activity apps and whether these preferences are associated with individual personality characteristics. Objective This study aimed to explore young adults’ opinions regarding BCTs (including self-regulation techniques) applied in mobile phone physical activity apps, and to examine associations between personality characteristics and ratings of BCTs applied in physical activity apps. Methods We conducted a cross-sectional online survey among healthy 18 to 30-year-old adults (N=179). Data on participants’ gender, age, height, weight, current education level, living situation, mobile phone use, personality traits, exercise self-efficacy, exercise self-identity, total physical activity level, and whether participants met Dutch physical activity guidelines were collected. Items for rating BCTs applied in physical activity apps were selected from a hierarchical taxonomy for BCTs, and were clustered into three BCT categories according to factor analysis: “goal setting and goal reviewing,” “feedback and self-monitoring,” and “social support and social comparison.” Results Most participants were female (n=146), highly educated (n=169), physically active, and had high levels of self-efficacy. In general, we observed high ratings of BCTs aimed to increase “goal setting and goal reviewing” and “feedback and self-monitoring,” but not for BCTs addressing “social support and social comparison.” Only 3 (out of 16 tested) significant associations between personality

  17. A survey of surveys

    SciTech Connect

    Kent, S.M.

    1994-11-01

    A new era for the field of Galactic structure is about to be opened with the advent of wide-area digital sky surveys. In this article, the author reviews the status and prospects for research for three new ground-based surveys: the Sloan Digital Sky Survey (SDSS), the Deep Near-Infrared Survey of the Southern Sky (DENIS) and the Two Micron All Sky Survey (2MASS). These surveys will permit studies of Galactic structure and stellar populations in the Galaxy with unprecedented detail. Extracting the information, however, will be challenging.

  18. Surveying Future Surveys

    NASA Astrophysics Data System (ADS)

    Carlstrom, John E.

    2016-06-01

    The now standard model of cosmology has been tested and refined by the analysis of increasingly sensitive, large astronomical surveys, especially with statistically significant millimeter-wave surveys of the cosmic microwave background and optical surveys of the distribution of galaxies. This talk will offer a glimpse of the future, which promises an acceleration of this trend with cosmological information coming from new surveys across the electromagnetic spectrum as well as particles and even gravitational waves.

  19. Integrated Technologies for Surveying Artefacts Damaged by Earthquakes. Application of All-In LIDAR Techniques in the City of L'AQUILA

    NASA Astrophysics Data System (ADS)

    Clini, P.; Quattrini, R.; Fiori, F.; Nespeca, R.

    2013-07-01

    The purpose of this work is to demonstrate how, in post-earthquake intervention scenarios, the latest "all-in-one" laser technologies, employed beyond their usual applications and integrated with more traditional survey methods, can define a comprehensive and original approach to the issues of surveying: safety of the artefacts, speed and low cost of surveys, and quality of the data and models provided for damage assessments and any required action. The case study of L'Aquila is therefore significant. The red area has essentially two types of buildings: monuments and historical buildings characterised by compact urban centres. Here we document the convent of the Blessed Antonia and the Antenucci Block, as case studies representative of the two types and as ideal laboratories to test the chosen method. In the first case, we document the project on a building that is yet to be secured and that therefore presents delicate issues in terms of survey speed and completeness, also in relation to the precious decorations that it holds. In the other case, we document the survey of a typical block in L'Aquila, already secured, which, given its size and complexity, requires an integrated approach that is more complex and more time-consuming in its methods of analysis.

  20. A critical review of field techniques employed in the survey of large woody debris in river corridors: a central European perspective.

    PubMed

    Máčka, Zdeněk; Krejčí, Lukáš; Loučková, Blanka; Peterková, Lucie

    2011-10-01

    In forested watersheds, large woody debris (LWD) is an integral component of river channels and floodplains. Fallen trees have a significant impact on physical and ecological processes in fluvial ecosystems. An enormous body of literature concerning LWD in river corridors is currently available. However, synthesis and statistical treatment of the published data are hampered by the heterogeneity of methodological approaches. Likewise, the precision and accuracy of data arising out of published surveys have yet to be assessed. For this review, a literature scrutiny of 100 randomly selected research papers was made to examine the most frequently surveyed LWD variables and field procedures. Some 29 variables arose for individual LWD pieces, and 15 variables for wood accumulations. The literature survey revealed a large variability in field procedures for LWD surveys. In many studies (32), description of field procedure proved less than adequate, rendering the results impossible to reproduce in comparable fashion by other researchers. This contribution identifies the main methodological problems and sources of error associated with the mapping and measurement of the most frequently surveyed variables of LWD, both as individual pieces and in accumulations. The discussion stems from our own field experience with LWD survey in river systems of various geomorphic styles and types of riparian vegetation in the Czech Republic in the 2004-10 period. We modelled variability in terms of LWD number, volume, and biomass for three geomorphologically contrasting river systems. The results appeared to be sensitive, in the main, to sampling strategy and prevailing field conditions; less variability was produced by errors of measurement. Finally, we propose a comprehensive standard field procedure for LWD surveyors, including a total of 20 variables describing spatial position, structural characteristics and the functions and dynamics of LWD. However, resources are only rarely

  1. Improving the Response Rate to a Street Survey: An Evaluation of the "But You Are Free to Accept or to Refuse" Technique.

    ERIC Educational Resources Information Center

    Gueguen, Nicolas; Pascual, Alexandre

    2005-01-01

    The "but you are free to accept or to refuse" technique is a compliance procedure in which someone is approached with a request by simply telling him/her that he/she is free to accept or to refuse the request. This semantic evocation leads to increased compliance with the request. Furthermore, in most of the studies in which this technique was…

  2. Spatial Techniques

    NASA Astrophysics Data System (ADS)

    Jabeur, Nafaa; Sahli, Nabil

    The environment, including the Earth and the immense space around it, is recognized to be the main source of useful information for human beings. For several decades, the acquisition of data from this environment was constrained by tools and techniques with limited capabilities. However, thanks to continuous technological advances, spatial data are now available in huge quantities for different applications. The technological advances have been achieved in terms of both hardware and software. They allow for better accuracy and availability, which in turn improves the quality and quantity of useful knowledge that can be extracted from the environment. Applied to geography, they have resulted in geospatial techniques. Applied to both science and technology, geospatial techniques have produced areas of expertise such as land surveying, cartography, navigation, remote sensing, Geographic Information Systems (GISs), and Global Positioning Systems (GPSs). These have evolved quickly with advances in computing, satellite technology and a growing demand to understand our global environment. In this chapter, we will discuss three important techniques that are widely used in spatial data acquisition and analysis: GPS and remote sensing techniques, which are used to collect spatial data, and GIS, which is used to store, manipulate, analyze, and visualize spatial data. Later in this book, we will discuss the techniques that are currently available for spatial knowledge discovery.

  3. Results of preconstruction surveys used as a management technique for conserving endangered species and their habitats on Naval Petroleum Reserve No. 1 (Elk Hills), Kern County, California

    SciTech Connect

    Kato, T.T.; O'Farrell, T.P.; Johnson, J.W.

    1985-08-01

    In 1976 an intensive program of petroleum production at maximum efficient rate was initiated on the US Department of Energy's (DOE) Naval Petroleum Reserve No. 1 (Elk Hills) in western Kern County, California. In a Biological Opinion required by the Endangered Species Act, the US Fish and Wildlife Service concluded that proposed construction and production activities may jeopardize the continued existence of the endangered San Joaquin kit fox, Vulpes macrotis mutica, and the blunt-nosed leopard lizard, Gambelia silus, inhabiting the Reserve. DOE committed itself to carrying out a compensation/mitigation plan to offset impacts of program activities on endangered species and their habitats. One compensation/mitigation strategy was to develop and implement preconstruction surveys to assess potential conflicts between proposed construction activities, and endangered species and their critical habitats, and to propose reasonable and prudent alternatives to avoid conflicts. Between 1980 and 1984, preconstruction surveys were completed for 296 of a total of 387 major construction projects encompassing 3590 acres. Fewer than 22% of the projects potentially conflicted with conservation of endangered species, and most conflicts were easily resolved by identifying sensitive areas that required protection. Only 8% of the projects received minor modification in their design or locations to satisfy conservation needs, and only three projects had to be completely relocated. No projects were cancelled or delayed because of conflicts with endangered species, and costs to conduct preconstruction surveys were minimal. 27 refs., 9 figs., 2 tabs.

  4. Optimizing end-to-end system performance for millimeter and submillimeter spectroscopy of protostars : wideband heterodyne receivers and sideband-deconvolution techniques for rapid molecular-line surveys

    NASA Astrophysics Data System (ADS)

    Sumner, Matthew Casey

    This thesis describes the construction, integration, and use of a new 230-GHz ultra-wideband heterodyne receiver, as well as the development and testing of a new sideband-deconvolution algorithm, both designed to enable rapid, sensitive molecular-line surveys. The 230-GHz receiver, known as Z-Rex, is the first of a new generation of wideband receivers to be installed at the Caltech Submillimeter Observatory (CSO). Intended as a proof-of-concept device, it boasts an ultra-wide IF output range of ~6-18 GHz, offering as much as a twelvefold increase in the spectral coverage that can be achieved with a single LO setting. A similarly wideband IF system has been designed to couple this receiver to an array of WASP2 spectrometers, allowing the full bandwidth of the receiver to be observed at low resolution, ideal for extragalactic redshift surveys. A separate IF system feeds a high-resolution 4-GHz AOS array frequently used for performing unbiased line surveys of galactic objects, particularly star-forming regions. The design and construction of the wideband IF system are presented, as is the work done to integrate the receiver and the high-resolution spectrometers into a working system. The receiver is currently installed at the CSO where it is available for astronomers' use. In addition to demonstrating wideband design principles, the receiver also serves as a testbed for a synthesizer-driven, active LO chain that is under consideration for future receiver designs. Several lessons have been learned, including the importance of driving the final amplifier of the LO chain into saturation and the absolute necessity of including a high-Q filter to remove spurious signals from the synthesizer output. The on-telescope performance of the synthesizer-driven LO chain is compared to that of the Gunn-oscillator units currently in use at the CSO. 
Although the frequency agility of the synthesized LO chain gives it a significant advantage for unbiased line surveys, the cleaner

  5. Neonatal extracorporeal membrane oxygenation devices, techniques and team roles: 2011 survey results of the United States' Extracorporeal Life Support Organization centers.

    PubMed

    Lawson, Scott; Ellis, Cory; Butler, Katie; McRobb, Craig; Mejak, Brian

    2011-12-01

    In early 2011, surveys of active Extracorporeal Life Support Organization (ELSO) centers within the United States were conducted by electronic mail regarding neonatal Extracorporeal Membrane Oxygenation (ECMO) equipment and professional staff. Seventy-four of 111 (67%) U.S. centers listed in the ELSO directory as neonatal centers responded to the survey. Of the responding centers, 53% routinely used roller pumps for neonatal ECMO, 15% reported using centrifugal pumps and 32% reported using a combination of both. Of the centers using centrifugal pumps, 51% reported that they do not use a compliance bladder in the circuit. The majority (95%) of roller pump users reported using a compliance bladder and 97% reported using Tygon® S-97-E tubing in the raceway of their ECMO circuits. Silicone membrane oxygenators were reportedly used by 25% of the respondents, 5% reported using micro-porous hollow fiber oxygenators (MPHF), 70% reported using polymethylpentene (PMP) hollow fiber oxygenators and 5% reported using a combination of the different types. Some form of in-line blood monitoring was used by 88% of the responding centers and 63% of responding centers reported using a circuit surface coating. Anticoagulation monitoring via the activated clotting time (ACT) was reported by 100% of the reporting centers. The use of extracorporeal cardiopulmonary resuscitation (ECPR) was reported by 53% of the responding centers, with 82% of those centers using a crystalloid-primed circuit to initiate ECPR. A cooling protocol was used by 77% of the centers that have an ECPR program. Comparison of these data with surveys from 2002 and 2008 shows that the use of silicone membrane oxygenators continues to decline, the use of centrifugal pumps continues to increase, and ECMO personnel continue to be composed of multidisciplinary groups of dedicated allied health care professionals. PMID:22416604

  7. Survey and evaluation of instream habitat and stock restoration techniques for wild pink and chum salmon. Restoration study number 105-1 (restoration project 93063). Exxon Valdez oil spill state/federal natural resource damage assessment final report

    SciTech Connect

    Willette, T.M.; Dudiak, N.C.; Honnold, S.G.; Carpenter, G.; Dickson, M.

    1995-08-01

    This project is the result of a three-year survey of the Exxon Valdez oil spill impact area to identify appropriate and cost-effective instream habitat restoration techniques for salmon, including spawning channels and improvement of fish passage through fish ladders or step-pool structures to overcome physical or hydrological barriers. Additional wild salmon stock rehabilitation measures include stream-side incubation boxes, remote egg-taking, incubation at existing hatcheries for fry stocking in oil-impacted streams, and fry rearing. Study results include the identification of the most promising instream habitat restoration projects in each of the spill-impacted areas.

  8. Knowledge based systems: A critical survey of major concepts, issues, and techniques. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1984-01-01

    This Working Paper Series entry presents a detailed survey of knowledge based systems. After being in a relatively dormant state for many years, Artificial Intelligence (AI) - that branch of computer science that attempts to have machines emulate intelligent behavior - is only recently accomplishing practical results. Most of these results can be attributed to the design and use of Knowledge-Based Systems, KBSs (or expert systems) - problem solving computer programs that can reach a level of performance comparable to that of a human expert in some specialized problem domain. These systems can act as a consultant for various requirements like medical diagnosis, military threat analysis, project risk assessment, etc. These systems possess knowledge that enables them to make intelligent decisions. They are, however, not meant to replace the human specialists in any particular domain. A critical survey of recent work in interactive KBSs is reported. A case study (MYCIN) of a KBS, a list of existing KBSs, and an introduction to the Japanese Fifth Generation Computer Project are provided as appendices. Finally, an extensive set of KBS-related references is provided at the end of the report.

  9. Ye Olde Maile Surveye.

    ERIC Educational Resources Information Center

    Berty, Ernest

    This publication is primarily designed for educational practitioners who possess little or no training in conducting mail surveys or have not kept current on the present state of the art of survey methods and techniques. It is also intended to be a checking and comparing aid to ensure that important research considerations are taken into account.…

  10. Techniques for computer-aided analysis of ERTS-1 data, useful in geologic, forest and water resource surveys. [Colorado Rocky Mountains

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1974-01-01

    Forestry, geology, and water resource applications were the focus of this study, which involved the use of computer-implemented pattern-recognition techniques to analyze ERTS-1 data. The results have proven the value of computer-aided analysis techniques, even in areas of mountainous terrain. Several analysis capabilities were developed during these ERTS-1 investigations. A procedure to rotate, deskew, and geometrically scale the MSS data produces 1:24,000 scale printouts that can be directly overlaid on 7 1/2-minute U.S.G.S. topographic maps. Several scales of computer-enhanced "false color-infrared" composites of MSS data can be obtained from a digital display unit, and these emphasize the tremendous detail present in the ERTS-1 data. A grid can also be superimposed on the displayed data to aid in specifying areas of interest.

  11. Repeated stool sampling and use of multiple techniques enhance the sensitivity of helminth diagnosis: a cross-sectional survey in southern Lao People's Democratic Republic.

    PubMed

    Sayasone, Somphou; Utzinger, Jürg; Akkhavong, Kongsap; Odermatt, Peter

    2015-01-01

    Intestinal parasitic infections are common in Lao People's Democratic Republic (Lao PDR). We investigated the accuracy of the Kato-Katz (KK) technique in relation to varying stool sampling efforts, and determined the effect of the concurrent use of a quantitative formalin-ethyl acetate concentration technique (FECT) on helminth diagnosis and on the appraisal of concomitant infections. The study was carried out between March and May 2006 in Champasack province, southern Lao PDR. Overall, 485 individuals aged ≥6 months who provided three stool samples were included in the final analysis. All stool samples were subjected to the KK technique. Additionally, one stool sample per individual was processed by FECT. Diagnosis was done under a light microscope by experienced laboratory technicians. Analysis of three stool samples with KK plus a single FECT was considered the diagnostic 'gold' standard and resulted in prevalence estimates of hookworm, Opisthorchis viverrini, Ascaris lumbricoides, Trichuris trichiura and Schistosoma mekongi infection of 77.9%, 65.0%, 33.4%, 26.2% and 24.3%, respectively. As expected, a single KK and a single FECT missed a considerable number of infections. While our diagnostic 'gold' standard produced results similar to those obtained by a mathematical model for most helminth infections, the 'true' prevalence predicted by the model for S. mekongi (28.1%) was somewhat higher than after multiple KK plus a single FECT (24.3%). In the current setting, triplicate KK plus a single FECT diagnosed helminth infections with high sensitivity. Hence, such a diagnostic approach might be utilised for generating high-quality baseline data, assessing anthelminthic drug efficacy and rigorous monitoring of community interventions. PMID:25225157
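The gain from repeated sampling can be sketched under a simplifying assumption of independent tests: if each test detects a true infection with sensitivity s, then k tests together detect it with probability 1 - (1 - s)^k. The per-smear sensitivity used below is hypothetical, not a value from the study.

```python
def combined_sensitivity(s, k):
    """Probability that at least one of k independent tests, each with
    per-test sensitivity s, detects a true infection."""
    return 1 - (1 - s) ** k

# Hypothetical per-smear sensitivity of 60%: three Kato-Katz smears
# would then detect about 93.6% of true infections
combined = combined_sensitivity(0.6, 3)  # 0.936
```

In practice repeated smears from the same host are not fully independent (egg excretion varies day to day but is correlated), so this is an upper-bound sketch rather than the study's model.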

  12. Investigating a damaging buried sinkhole cluster in an urban area (Zaragoza city, NE Spain) integrating multiple techniques: Geomorphological surveys, DInSAR, DEMs, GPR, ERT, and trenching

    NASA Astrophysics Data System (ADS)

    Carbonel, Domingo; Rodríguez-Tribaldos, Verónica; Gutiérrez, Francisco; Galve, Jorge Pedro; Guerrero, Jesús; Zarroca, Mario; Roqué, Carles; Linares, Rogelio; McCalpin, James P.; Acosta, Enrique

    2015-01-01

    This contribution analyses a complex sinkhole cluster buried by urban elements in the mantled evaporite karst of Zaragoza city, NE Spain, where active subsidence has caused significant economic losses (~ 0.3 million Euro). The investigation, conducted after the development of the area, has involved the application of multiple surface and subsurface techniques. A detailed map of modern surface deformation indicates two active coalescing sinkholes, whereas the interpretation of old aerial photographs reveals the presence of two additional dormant sinkholes beneath human structures that might reactivate in the near future. DInSAR (Differential Interferometry Synthetic Aperture Radar) displacement data have limited spatial coverage, mainly due to high subsidence rates and surface changes (re-pavement), and the Electrical Resistivity Tomography (ERT) and trenching investigations were severely restricted by the presence of urban elements. Nonetheless, the three techniques consistently indicate that the area affected by subsidence is larger than that defined by surface deformation features. The performance of the Ground Penetrating Radar (GPR) technique was adversely affected by the presence of highly conductive and massive anthropogenic deposits, but some profiles reveal that subsidence in the central sector of one of the sinkholes is mainly accommodated by sagging. The stratigraphic and structural relationships observed in a trench dug across the topographic margin of one of the sinkholes may alternatively be interpreted as three collapse events of around 0.6 m that occurred after 290 yr BP, or as progressive fault displacement combined with episodic anthropogenic excavation and fill. Average subsidence rates of > 6.6 mm/yr and 40 mm/yr have been calculated using stratigraphic markers dated by the radiocarbon method and historical information, respectively. This case study illustrates the need to conduct thorough investigations in sinkhole areas during the pre

  13. Integrated non-invasive remote-sensing techniques and field survey for the geoarchaeological study of the Sud Lípez mining district, Bolivia

    NASA Astrophysics Data System (ADS)

    Deroin, Jean-Paul; Téreygeol, Florian; Cruz, Pablo; Guillot, Ivan; Méaudre, Jean-Charles

    2012-08-01

    New investigations have been carried out in the framework of a joint French-Argentine project dealing with the mineral resources and the metal production in the Andean plateau from the 10th to the 18th century. Geoarchaeology of the Sud Lípez, southern Bolivia, is revisited using multisource remote-sensing data including archive data from the 1960s and recent very high resolution (VHR) data simultaneously acquired with field work. The detailed geological mapping of the area is allowed by the field survey complemented by the multispectral and VHR data. The emphasis is on integrating all the geological features such as morphologies, petrology of the volcanics, lithology of the volcano-sedimentary rocks, regional and local faulting, veins, hydrothermally altered rocks, etc. GeoEye-1, which features the most advanced technology ever used in a civilian remote-sensing system, allows the detailed mapping of the archaeological remains that are particularly numerous at San Antonio de Lípez, with shallow pits, shafts connected in depth with adits, and large slag areas. Particularly, the plan of three old miners' villages has been drawn and its accuracy has been evaluated.

  14. Application of electromagnetic techniques in survey of contaminated groundwater at an abandoned mine complex in southwestern Indiana, U.S.A.

    USGS Publications Warehouse

    Brooks, G.A.; Olyphant, G.A.; Harper, D.

    1991-01-01

    In part of a large abandoned mining complex, electromagnetic geophysical surveys were used along with data derived from cores and monitoring wells to infer sources of contamination and subsurface hydrologic connections between acidic refuse deposits and adjacent undisturbed geologic materials. Electrical resistivity increases sharply along the boundary of an elevated deposit of pyritic coarse refuse, which is highly contaminated and electrically conductive, indicating poor subsurface hydrologic connections with surrounding deposits of fine refuse and undisturbed glacial material. Groundwater chemistry, as reflected in values of specific conductance, also differs markedly across the deposit's boundary, indicating that a widespread contaminant plume has not developed around the coarse refuse in more than 40 yr since the deposit was created. Most acidic drainage from the coarse refuse is by surface runoff and is concentrated around stream channels. Although most of the contaminated groundwater within the study area is concentrated within the surficial refuse deposits, transects of apparent resistivity and phase angle indicate the existence of an anomalous conductive layer at depth (>4 m) in thick alluvial sediments along the northern boundary of the mining complex. Based on knowledge of local geology, the anomaly is interpreted to represent a subsurface connection between the alluvium and a flooded abandoned underground mine. ?? 1991 Springer-Verlag New York Inc.

  15. The OPD photometric survey of open clusters I. Techniques, program details and first results of robust determination of the fundamental parameters

    NASA Astrophysics Data System (ADS)

    Caetano, T. C.; Dias, W. S.; Lépine, J. R. D.; Monteiro, H. S.; Moitinho, A.; Hickel, G. R.; Oliveira, A. F.

    2015-07-01

    Open clusters are considered valuable objects for the investigation of galactic structure and dynamics since their distances, ages and velocities can be determined with good precision. According to the New Catalog of Optically Visible Open Clusters and Candidates (Dias et al., 2002), about 10% of the optically revealed open clusters remain unstudied. However, previous analysis (Moitinho, 2010) has indicated that not considering this unstudied population introduces significant biases in the study of the structure and evolution of the Milky Way. In addition, a systematic revision of the data contained in the catalog, collected from the literature, is needed, due to its inhomogeneity. In this first paper of a series, we present the observational strategy, data reduction and analysis procedures of a UBVRI photometric survey of southern open star clusters carried out at Pico dos Dias Observatory (Brazil). The aim of the program is to contribute to an unbiased, homogeneous collection of cluster fundamental parameters. We show that the implementation of a sequence of systematic procedures considerably improves the quality of the results. To illustrate the methods we present the first results based on one night of observations. The parameters (reddening, distance, age and metallicity) were obtained by fitting theoretical isochrones to cluster color-color and multidimensional color-magnitude diagrams, applying a cross-entropy optimization algorithm developed by our group, which takes into account UBVRI photometric data weighted using a membership-likelihood estimation.
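
    The cross-entropy optimization the abstract refers to can be sketched in a few lines: sample trial parameter vectors from a Gaussian, keep the best-fitting "elite" fraction, and refit the Gaussian to them. The sketch below is illustrative only, with a toy quadratic misfit standing in for the actual isochrone-fitting objective; none of the names come from the authors' code.

```python
import numpy as np

def cross_entropy_min(f, dim, n=100, elite=10, iters=50, seed=0):
    """Minimal cross-entropy optimizer: sample from a Gaussian,
    refit it to the elite (lowest-cost) samples, and repeat."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.ones(dim) * 5.0
    for _ in range(iters):
        X = rng.normal(mu, sigma, size=(n, dim))      # trial solutions
        costs = np.apply_along_axis(f, 1, X)          # evaluate misfit
        elite_X = X[np.argsort(costs)[:elite]]        # keep the best
        mu = elite_X.mean(axis=0)                     # refit the sampler
        sigma = elite_X.std(axis=0) + 1e-8
    return mu

# Toy cost: distance to (3, -2), standing in for an isochrone misfit.
best = cross_entropy_min(lambda p: np.sum((p - np.array([3.0, -2.0]))**2), dim=2)
print(best)  # ≈ [3, -2]
```

    In the real application the cost function would compare synthetic isochrone photometry to the observed, membership-weighted color-magnitude diagrams rather than a quadratic.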

  16. Development of sensors for ceramic components in advanced propulsion systems: Survey and evaluation of measurement techniques for temperature, strain and heat flux for ceramic components in advanced propulsion systems

    NASA Technical Reports Server (NTRS)

    Atkinson, W. H.; Cyr, M. A.; Strange, R. R.

    1988-01-01

    The report presents the final results of Tasks 1 and 2, Development of Sensors for Ceramic Components in Advanced Propulsion Systems (NASA program NAS3-25141). During Task 1, an extensive survey was conducted of sensor concepts which have the potential for measuring surface temperature, strain and heat flux on ceramic components for advanced propulsion systems. Each sensor concept was analyzed and evaluated under Task 2; sensor concepts were then recommended for further development. For temperature measurement, both pyrometry and thermographic phosphors are recommended for measurements up to and beyond the melting point of ceramic materials. For lower temperature test programs, the thin-film techniques offer advantages in the installation of temperature sensors. Optical strain measurement techniques are recommended because they offer the possibility of being useful at very high temperature levels. Techniques for the measurement of heat flux are recommended for development based on both a surface mounted sensor and the measurement of the temperature differential across a portion of a ceramic component or metallic substrate.

  17. "Dark" Atomic Gas in the Diffuse Interstellar Medium

    NASA Astrophysics Data System (ADS)

    Reach, William T.; Heiles, Carl; Bernard, Jean-Philippe

    2015-08-01

    Far-infrared and gamma-ray surveys indicate there are significantly more nucleons in the diffuse interstellar medium than are traced by HI and CO emission. We are using the Arecibo Observatory to complement Planck observations, testing hypotheses for the origin of the "dark gas" associated with the far-infrared and gamma-ray excesses. The "dark gas" is the far-infrared emission in excess of what can be explained by dust mixed with atomic gas traced by the 21-cm line in the GALFA survey. First, we test the hypothesis that the excess is molecular gas by measuring OH absorption toward selected bright radio sources. Second, we are observing HI absorption, because cold atomic gas is optically thick and does not emit as readily in the 21-cm line, but can be seen in absorption against radio continuum sources. We will observe radio sources near clouds whose far-infrared emission, measured by Planck, is in excess of the high-resolution HI observations from the Arecibo GALFA HI survey.

  18. Advances in Bio-Tactile Sensors for Minimally Invasive Surgery Using the Fibre Bragg Grating Force Sensor Technique: A Survey

    PubMed Central

    Abushagur, Abdulfatah A.G.; Arsad, Norhana; Ibne Reaz, Mamun; Ashrif, A.; Bakar, A.

    2014-01-01

    The large interest in utilising fibre Bragg grating (FBG) strain sensors for minimally invasive surgery (MIS) applications to replace conventional electrical tactile sensors has grown in the past few years. FBG strain sensors offer the advantages of optical fibre sensors, such as high sensitivity, immunity to electromagnetic noise, electrical passivity and chemical inertness, but are not limited by phase discontinuity or intensity fluctuations. FBG sensors feature a wavelength-encoding sensing signal that enables distributed sensing that utilises fewer connections. In addition, their flexibility and lightness allow easy insertion into needles and catheters, thus enabling localised measurements inside tissues and blood. Two types of FBG tactile sensors have been emphasised in the literature: single-point and array FBG tactile sensors. This paper describes the current design, development and research of the optical fibre tactile techniques that are based on FBGs to enhance the performance of MIS procedures in general. Providing MIS or microsurgery surgeons with accurate and precise measurement and control of the contact forces during tissue manipulation will benefit both surgeons and patients. PMID:24721774

  19. Survey/Feedback. Basic School PR Guide.

    ERIC Educational Resources Information Center

    Banach, William J.

    To help improve school public relations programs, this handbook tells how to use survey and feedback techniques and how to interpret survey results. The first chapter gives a brief overview of the usefulness of surveys for getting community feedback. Chapter 2 recommends beginning by deciding what one wants to know from a survey, what human and…

  20. "Suntelligence" Survey

    MedlinePlus

    ... to the American Academy of Dermatology's "Suntelligence" sun-smart survey. Please answer the following questions to measure ... be able to view a ranking of major cities' suntelligence based on residents' responses to this survey. ...

  1. Hidden Galactic Accretion: The Discovery of Low-Velocity Halo Clouds

    NASA Astrophysics Data System (ADS)

    Peek, J. E. G.; Putman, M. E.; Sommer-Larsen, J.; Heiles, C. E.; Stanimirovic, S.; Douglas, K.; Gibson, S.; Korpela, E.

    2007-12-01

    High-Velocity Clouds (HVCs) have been thought to be part of the Galactic accretion process since their discovery more than 40 years ago. Two modes through which HVCs may be generated and contribute to the ongoing growth of our Galaxy are (1) the tidal stripping of satellite galaxies and (2) the fragmented condensation of the Galaxy's hot baryonic halo. We have run cosmological Tree-SPH simulations of a Milky-Way sized galaxy, in which we can resolve clouds down to 10^5 M⊙, in an attempt to probe the cooling halo accretion process. The simulations show that this HVC generation mechanism can indeed reproduce the characteristics of the observed population of HVCs, including their flux, velocity and clustering properties. These simulations also predict an equally large population of halo clouds moving at lower radial velocities: Low-Velocity Halo Clouds (LVHCs). These clouds would not be observed as HVCs, but would instead be confused with local disk gas. Taking advantage of the empirical result that HVCs have undetectably low infrared dust flux compared to their 21-cm column, we search for these clouds in the preliminary GALFA-HI survey and IRAS. We announce the discovery of the first examples of these clouds, and describe their properties. This work was supported in part by NSF grant AST 04-06987 and NSF grant AST 07-09347.

  2. A Survey of Dimension Reduction Techniques

    SciTech Connect

    Fodor, I K

    2002-05-09

    Advances in data collection and storage capabilities during the past decades have led to an information overload in most sciences. Researchers working in domains as diverse as engineering, astronomy, biology, remote sensing, economics, and consumer transactions face larger and larger observations and simulations on a daily basis. Such datasets, in contrast with smaller, more traditional datasets that have been studied extensively in the past, present new challenges in data analysis. Traditional statistical methods break down partly because of the increase in the number of observations, but mostly because of the increase in the number of variables associated with each observation. The dimension of the data is the number of variables that are measured on each observation. High-dimensional datasets present many mathematical challenges as well as some opportunities, and are bound to give rise to new theoretical developments. One of the problems with high-dimensional datasets is that, in many cases, not all the measured variables are "important" for understanding the underlying phenomena of interest. While certain computationally expensive novel methods can construct predictive models with high accuracy from high-dimensional data, it is still of interest in many applications to reduce the dimension of the original data prior to any modeling of the data. In this paper, we describe several dimension reduction methods.
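
    The best-known of the methods such a survey covers is principal component analysis (PCA), which reduces dimension by projecting the data onto the directions of largest variance. A minimal sketch via the SVD (illustrative only; the survey treats many more techniques):

```python
import numpy as np

def pca(X, k):
    """Project an n-by-p data matrix X onto its top-k principal components."""
    Xc = X - X.mean(axis=0)                       # center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                          # n-by-k reduced representation

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                    # 200 observations, 10 variables
Z = pca(X, 2)                                     # reduce to 2 dimensions
print(Z.shape)  # (200, 2)
```

    The component scores in `Z` are mutually orthogonal by construction, which is what makes the reduced variables non-redundant.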

  3. A survey of laser lightning rod techniques

    NASA Technical Reports Server (NTRS)

    Barnes, Arnold A., Jr.; Berthel, Robert O.

    1991-01-01

    The work done to create a laser lightning rod (LLR) is discussed. Some ongoing research which has the potential for achieving an operational laser lightning rod for use in the protection of missile launch sites, launch vehicles, and other property is discussed. Because of the ease with which a laser beam can be steered into any cloud overhead, an LLR could be used to ascertain if there exists enough charge in the clouds to discharge to the ground as triggered lightning. This leads to the possibility of using LLRs to test clouds prior to launching missiles through the clouds or prior to flying aircraft through the clouds. LLRs could also be used to probe and discharge clouds before or during any hazardous ground operations. Thus, an operational LLR may be able to both detect such sub-critical electrical fields and effectively neutralize them.

  4. Remote Raman measurement techniques

    NASA Astrophysics Data System (ADS)

    Leonard, D. A.

    1981-02-01

    The use of laser Raman measurement techniques in remote sensing applications is surveyed. A feasibility index is defined as a means to characterize the practicality of a given remote Raman measurement application. Specific applications of Raman scattering to the measurement of atmospheric water vapor profiles, methane plumes from liquid natural gas spills, and subsurface ocean temperature profiles are described. This paper will survey the use of laser Raman measurement techniques in remote sensing applications using as examples specific systems that the Computer Genetics Corporation (CGC) group has developed and engineered.

  5. Remote Raman measurement techniques

    NASA Technical Reports Server (NTRS)

    Leonard, D. A.

    1981-01-01

    The use of laser Raman measurement techniques in remote sensing applications is surveyed. A feasibility index is defined as a means to characterize the practicality of a given remote Raman measurement application. Specific applications of Raman scattering to the measurement of atmospheric water vapor profiles, methane plumes from liquid natural gas spills, and subsurface ocean temperature profiles are described. This paper will survey the use of laser Raman measurement techniques in remote sensing applications using as examples specific systems that the Computer Genetics Corporation (CGC) group has developed and engineered.

  6. Remote Raman Measurement Techniques

    NASA Astrophysics Data System (ADS)

    Leonard, Donald A.

    1981-02-01

    The use of laser Raman measurement techniques in remote sensing applications is surveyed. A feasibility index is defined as a means to characterize the practicality of a given remote Raman measurement application. Specific applications of Raman scattering to the measurement of atmospheric water vapor profiles, methane plumes from liquid natural gas spills, and subsurface ocean temperature profiles are described. This paper will survey the use of laser Raman measurement techniques in remote sensing applications using as examples specific systems that the Computer Genetics Corporation (CGC) group has developed and engineered.

  7. Survey Says

    ERIC Educational Resources Information Center

    McCarthy, Susan K.

    2005-01-01

    Survey Says is a lesson plan designed to teach college students how to access Internet resources for valid data related to the sexual health of young people. Discussion questions based on the most recent available data from two national surveys, the Youth Risk Behavior Surveillance-United States, 2003 (CDC, 2004) and the National Survey of…

  8. Guidelines for Consumers of Survey Research.

    ERIC Educational Resources Information Center

    Brown, Darine F.; And Others

    1981-01-01

    School counselors need to understand survey methodology even if they do not use it as research practitioners. Counselors can be better consumers of survey research by noting response rates, sampling techniques, reliability and validity in research articles. (JAC)

  9. Theory Survey or Survey Theory?

    ERIC Educational Resources Information Center

    Dean, Jodi

    2010-01-01

    Matthew Moore's survey of political theorists in U.S. American colleges and universities is an impressive contribution to political science (Moore 2010). It is the first such survey of political theory as a subfield, the response rate is very high, and the answers to the survey questions provide new information about how political theorists look…

  10. Conducting the Survey

    ERIC Educational Resources Information Center

    Ritter, Lois A., Ed.; Sue, Valerie M., Ed.

    2007-01-01

    Research regarding the optimal fielding of online surveys is in its infancy and just beginning to offer clear suggestions for effective recruiting of participants as well as techniques for maximizing the response rate. In this article, the authors discuss the process of recruiting participants by e-mailing invitations to a list of recipients…

  11. Risk analysis methodology survey

    NASA Technical Reports Server (NTRS)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  12. Interpretation Techniques Development

    NASA Technical Reports Server (NTRS)

    Alford, W. L.

    1973-01-01

    The processes, algorithms, and procedures for extraction and interpretation of ERTS-1 data are discussed. Analysis of temporally acquired data is possible through geometric correction, correlation, and registration techniques. The powerful image-enhancement techniques developed for the lunar and planetary programs are valuable for Earth Resources Survey programs. There is evidence that both optical and digital methods of spatial information extraction can provide valuable sources of information for the ERTS system. The techniques available, even for a limited number of bands and limited resolution, can be effectively used to extract much of the information required by resource managers.

  13. Redshift surveys

    NASA Technical Reports Server (NTRS)

    Geller, Margaret J.; Huchra, J. P.

    1991-01-01

    Present-day understanding of the large-scale galaxy distribution is reviewed. The statistics of the CfA redshift survey are briefly discussed. The need for deeper surveys to clarify the issues raised by recent studies of large-scale galactic distribution is addressed.

  14. SURVEY INSTRUMENT

    DOEpatents

    Borkowski, C J

    1954-01-19

    This pulse-type survey instrument is suitable for readily detecting α particles in the presence of high β and γ backgrounds. The instrument may also be used to survey for neutrons, β particles, and γ rays by employing suitably designed interchangeable probes and selecting an operating potential to correspond to the particular probe.

  15. The Dark Energy Survey

    SciTech Connect

    Flaugher, Brenna; /Fermilab

    2004-11-01

    Dark Energy is the dominant constituent of the universe, yet we have little understanding of it. We describe a new project aimed at measuring the dark energy equation of state parameter, w, to a statistical precision of ~5%, with four separate techniques. The survey will image 5000 deg² in the southern sky and collect 300 million galaxies, 30,000 galaxy clusters, and 2000 Type Ia supernovae. The survey will be carried out using a new 3 deg² mosaic camera mounted at the prime focus of the 4m Blanco telescope at CTIO.

  16. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study, which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  17. Terminating Sequential Delphi Survey Data Collection

    ERIC Educational Resources Information Center

    Kalaian, Sema A.; Kasim, Rafa M.

    2012-01-01

    The Delphi survey technique is an iterative mail or electronic (e-mail or web-based) survey method used to obtain agreement or consensus among a group of experts in a specific field on a particular issue through a well-designed and systematic multiple sequential rounds of survey administrations. Each of the multiple rounds of the Delphi survey…

  18. Surveying System

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Sunrise Geodetic Surveys are setting up their equipment for a town survey. Their equipment differs from conventional surveying systems that employ transit, rod, and chain to measure angles and distances. They are using ISTAC Inc.'s Model 2002 positioning system, which offers fast, accurate surveying with exceptional signals from orbiting satellites. The special utility of the ISTAC Model 2002 is that it can provide positioning of the highest accuracy from Navstar PPS signals because it requires no knowledge of secret codes. It operates by comparing the frequency and time phase of a Navstar signal arriving at one ISTAC receiver with the reception of the same set of signals by another receiver. Data is computer processed and translated into three-dimensional position data: latitude, longitude and elevation.

  19. Survey of digital filtering

    NASA Technical Reports Server (NTRS)

    Nagle, H. T., Jr.

    1972-01-01

    A three part survey is made of the state-of-the-art in digital filtering. Part one presents background material including sampled data transformations and the discrete Fourier transform. Part two, digital filter theory, gives an in-depth coverage of filter categories, transfer function synthesis, quantization and other nonlinear errors, filter structures and computer aided design. Part three presents hardware mechanization techniques. Implementations by general purpose, mini-, and special-purpose computers are presented.
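
    The filter categories and transfer-function synthesis surveyed in part two can be illustrated with the simplest FIR low-pass, a moving average, and the discrete Fourier transform from part one then verifies its frequency response. The sketch below is illustrative and not taken from the survey; sample rate and tone frequencies are invented.

```python
import numpy as np

def moving_average(x, M=8):
    """Length-M moving-average FIR low-pass: y[n] = (1/M) * sum x[n-k]."""
    h = np.ones(M) / M                     # impulse response
    return np.convolve(x, h, mode="same")

fs = 1000.0                                # sample rate, Hz
t = np.arange(0, 1, 1 / fs)
# A 5 Hz signal contaminated by a 200 Hz tone.
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 200 * t)
y = moving_average(x)

# Check the response with the DFT: the 200 Hz bin should shrink,
# the 5 Hz bin should be nearly untouched.
f = np.fft.rfftfreq(len(t), 1 / fs)
X, Y = np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(y))
att = Y[f == 200][0] / X[f == 200][0]
print(att)   # ≈ 0.2: the 200 Hz tone is attenuated about 5x
```

    The analytic magnitude response |sin(πfM/fs) / (M sin(πf/fs))| predicts the same ≈0.2 gain at 200 Hz, an example of the quantization-free transfer-function analysis the survey's part two formalizes.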

  20. Optimal systems of geoscience surveying: A preliminary discussion

    NASA Astrophysics Data System (ADS)

    Shoji, Tetsuya

    2006-10-01

    In any geoscience survey, each survey technique must be effectively applied, and many techniques are often combined optimally. An important task is to get necessary and sufficient information to meet the requirement of the survey. A prize-penalty function quantifies effectiveness of the survey, and hence can be used to determine the best survey technique. On the other hand, an information-cost function can be used to determine the optimal combination of survey techniques on the basis of the geoinformation obtained. Entropy is available to evaluate geoinformation. A simple model suggests the possibility that low-resolvability techniques are generally applied at early stages of survey, and that higher-resolvability techniques should alternate with lower-resolvability ones with the progress of the survey.
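
    The entropy evaluation of geoinformation proposed here can be sketched as follows: each survey technique sharpens a posterior distribution over ground classes, and its information gain is the drop in Shannon entropy, to be weighed against cost in the information-cost function. The class probabilities below are invented purely for illustration.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy H = -sum p*log2(p) of a probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0*log(0) taken as 0
    return float(-(p * np.log2(p)).sum())

# Posterior over 4 hypothetical lithology classes:
prior  = [0.25, 0.25, 0.25, 0.25]      # before surveying: maximal uncertainty
coarse = [0.55, 0.25, 0.15, 0.05]      # after a low-resolvability technique
fine   = [0.90, 0.06, 0.03, 0.01]      # after a high-resolvability technique

for name, p in [("prior", prior), ("coarse", coarse), ("fine", fine)]:
    print(name, round(entropy_bits(p), 3))
# Information gained by a technique = H(prior) - H(posterior).
```

    Under this toy model the cheap coarse technique buys the first fraction of a bit and the expensive fine one the rest, which is why low-resolvability methods tend to be optimal early in a survey.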

  1. Dismantling techniques

    SciTech Connect

    Wiese, E.

    1998-03-13

    Most of the dismantling techniques used in a Decontamination and Dismantlement (D and D) project are taken from conventional demolition practices. Some modifications to the techniques are made to limit exposure to the workers or to lessen the spread of contamination to the work area. When working on a D and D project, it is best to keep the dismantling techniques and tools as simple as possible. The workers will be more efficient and safer using techniques that are familiar to them. Prior experience with the technique or use of mock-ups is the best way to keep workers safe and to keep the project on schedule.

  2. Complexity Survey.

    ERIC Educational Resources Information Center

    Gordon, Sandra L.; Anderson, Beth C.

    To determine whether consensus existed among teachers about the complexity of common classroom materials, a survey was administered to 66 pre-service and in-service kindergarten and prekindergarten teachers. Participants were asked to rate 14 common classroom materials as simple, complex, or super-complex. Simple materials have one obvious part,…

  3. Drug Survey.

    ERIC Educational Resources Information Center

    Gill, Wanda E.; And Others

    Results of a survey of student perceptions of drugs and drug use that was conducted at Bowie State College are presented. Studies that have been conducted on college students' use of alcohol, marijuana, and cocaine in the last five years are reviewed, along with additional studies relating to the general population and the following drugs:…

  4. The MST Radar Technique

    NASA Technical Reports Server (NTRS)

    Roettger, J.

    1984-01-01

    The coherent radar technique is reviewed with special emphasis on mesosphere-stratosphere-troposphere (MST) radars operating in the VHF band. A basic introduction to Doppler radar measurements and the radar equation is followed by an outline of the characteristics of atmospheric turbulence, viewed in terms of the scattering and reflection of radar signals. Radar signal acquisition and preprocessing, namely coherent detection, digital sampling, pre-integration and coding, are briefly discussed. The data analysis is presented in terms of correlation and spectrum analysis, yielding the essential parameters: power, signal-to-noise ratio, average and fluctuating velocity, and persistency. The techniques to measure wind velocities, viz. the different modes of the Doppler method as well as the spaced antenna method, are surveyed, and the feasibility of the MST radar interferometer technique is elucidated. A general view of the criteria for designing phased array antennas is given. An outline of the hardware of a typical MST radar system is presented.
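
    The spectrum analysis described here reduces each Doppler spectrum to its low-order moments: total power, mean radial velocity, and spectral width. A sketch with a synthetic Gaussian echo (illustrative values, not real radar data):

```python
import numpy as np

def spectral_moments(v, S):
    """0th, 1st, and 2nd moments of a Doppler spectrum S sampled on axis v."""
    P = S.sum()                                        # total power
    vbar = (v * S).sum() / P                           # mean Doppler velocity
    width = np.sqrt(((v - vbar) ** 2 * S).sum() / P)   # spectral width
    return P, vbar, width

v = np.linspace(-20, 20, 512)                  # radial velocity axis, m/s
S = np.exp(-0.5 * ((v - 3.0) / 1.5) ** 2)      # synthetic echo at +3 m/s, 1.5 m/s wide
P, vbar, width = spectral_moments(v, S)
print(round(float(vbar), 2), round(float(width), 2))  # ≈ 3.0 1.5
```

    In practice a noise floor is estimated and subtracted before taking the moments, and the signal-to-noise ratio follows from the same spectrum.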

  5. Utilization of Marketing Techniques in California Community Colleges.

    ERIC Educational Resources Information Center

    Gregory, Judi A.

    A survey of the 107 California community colleges was conducted during Spring 1980 to assess the extent to which college administrators had adopted marketing techniques. The survey instrument listed 31 such techniques under six general categories: marketing surveys, direct advertising, public information, high school recruiting, community…

  6. Mass spectrometry. [review of techniques

    NASA Technical Reports Server (NTRS)

    Burlingame, A. L.; Kimble, B. J.; Derrick, P. J.

    1976-01-01

    Advances in mass spectrometry (MS) and its applications over the past decade are reviewed in depth, with annotated literature references. New instrumentation and techniques surveyed include: modulated-beam MS, chromatographic MS, on-line computer techniques, digital computer-compatible quadrupole MS, selected ion monitoring (mass fragmentography), and computer-aided management of MS data and interpretation. Areas of application surveyed include: organic MS and electron impact MS, field ionization kinetics, appearance potentials, translational energy release, studies of metastable species, photoionization, calculations of molecular orbitals, chemical kinetics, field desorption MS, high pressure MS, ion cyclotron resonance, biochemistry, medical/clinical chemistry, pharmacology, and environmental chemistry and pollution studies.

  7. A supplement to "Methods for collection and analysis of aquatic biological and microbiological samples" (U.S. Geological Survey Techniques of Water-Resources Investigations, Book 5, Chapter A4)

    USGS Publications Warehouse

    1979-01-01

    The manual contains methods used by the U.S. Geological Survey to collect, preserve, and analyze waters to determine their biological and microbiological properties. It supplements 'Methods for Collection and Analysis of Aquatic Biological and Microbiological Samples' (TWRI, Book 5, Chapter A4, 1977, edited by P. E. Greeson, T. A. Ehlke, G. A. Irwin, B. W. Lium, and K. V. Slack). Included are 5 new methods, a new section of selected taxonomic references for Ostracoda, and 6 revised methods.

  8. Multiple Surveys of Students and Survey Fatigue

    ERIC Educational Resources Information Center

    Porter, Stephen R.; Whitcomb, Michael E.; Weitzer, William H.

    2004-01-01

    This chapter reviews the literature on survey fatigue and summarizes a research project that indicates that administering multiple surveys in one academic year can significantly suppress response rates in later surveys. (Contains 4 tables.)

  9. Extragalactic counterparts to Einstein slew survey sources

    NASA Technical Reports Server (NTRS)

    Schachter, Jonathan F.; Elvis, Martin; Plummer, David; Remillard, Ron

    1992-01-01

    The Einstein slew survey consists of 819 bright X-ray sources, of which 636 (or 78 percent) are identified with counterparts in standard catalogs. The importance of bright X-ray surveys is stressed, and the slew survey is compared to the Rosat all sky survey. Statistical techniques for minimizing confusion in arcminute error circles in digitized data are discussed. The 238 slew survey active galactic nuclei, clusters, and BL Lacertae objects identified to date and their implications for logN-logS and source evolution studies are described.

  10. Stapedectomy technique.

    PubMed

    House, J W

    1993-06-01

    This article reviews the evolution of the author's stapedectomy technique from total footplate removal with single loop wire prosthesis and Gelfoam seal to small fenestra stapedectomy with platinum ribbon piston prosthesis and blood seal. The author concludes that the microdrill is effective, safe, and cost effective for performing this procedure. Since using this technique, the author has had no cases of sensorineural hearing loss and few complaints of dizziness or vertigo. PMID:8341570