NASA Astrophysics Data System (ADS)
Chang, Chia-Hao; Chu, Tzu-How
2017-04-01
To control rice production and farmland use in Taiwan, the Agriculture and Food Agency (AFA) has published a series of policies since 1983 that subsidize farmers to plant different crops or to practice fallow. Because there was no efficient, examinable mechanism for verifying the fallow fields surveyed by township offices, illegal fallow fields recurred each year. In this research, we used remote sensing images, GIS data of fields, and application records of fallow fields to establish a method for detecting illegal fallow fields in Yulin County in central Taiwan. The method comprised: 1. collecting multi-temporal images from FS-2 or the SPOT series over 4 time periods; 2. combining the application records and GIS data of fields to verify the locations of fallow fields; 3. conducting a ground truth survey and classifying the images with ISODATA and Maximum Likelihood Classification (MLC); 4. defining the land cover type of each fallow field by zonal statistics; 5. verifying accuracy against the ground truth; 6. developing a survey method for potential illegal fallow fields and estimating its benefit. Using 190 fallow fields (127 legal and 63 illegal) as ground truth, the producer and user accuracies of illegal fallow field interpretation were 71.43% and 38.46%, respectively. If a township office surveyed the 117 fields classified as illegal, 45 of the 63 illegal fallow fields would be detected. With this method, township offices can save 38.42% of the manpower needed to detect illegal fallow fields while obtaining an examinable 71.43% producer accuracy.
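The reported figures are internally consistent and follow directly from the counts given above (45 correctly detected fields among 117 flagged and 63 actually illegal); a quick check in Python:

```python
# Producer and user accuracy for the "illegal fallow field" class,
# using the counts reported in the abstract.
true_positives = 45    # illegal fields correctly flagged
flagged = 117          # fields classified as illegal
actual_illegal = 63    # ground-truth illegal fields

producer_accuracy = true_positives / actual_illegal  # omission view
user_accuracy = true_positives / flagged             # commission view

print(f"producer accuracy: {producer_accuracy:.2%}")  # 71.43%
print(f"user accuracy: {user_accuracy:.2%}")          # 38.46%
```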
NASA Astrophysics Data System (ADS)
Kosmowski, Frédéric; Stevenson, James; Campbell, Jeff; Ambel, Alemayehu; Haile Tsegay, Asmelash
2017-10-01
Maintaining permanent soil coverage using crop residues is an important and commonly recommended practice in conservation agriculture. Measuring this practice is an essential step in improving knowledge about the adoption and impact of conservation agriculture. Different data collection methods can be implemented to capture the field-level crop residue coverage for a given plot, each with its own implications for survey budget, implementation speed, and respondent and interviewer burden. In this paper, six alternative methods of crop residue coverage measurement are tested among the same sample of rural households in Ethiopia. The relative accuracy of these methods is compared against a benchmark, the line-transect method. The alternative methods compared against the benchmark are: (i) interviewee (respondent) estimation; (ii) enumerator estimation visiting the field; (iii) interviewee estimation with a visual aid, without visiting the field; (iv) enumerator estimation with a visual aid, visiting the field; (v) a field picture collected with a drone and analyzed with image-processing methods; and (vi) a satellite picture of the field analyzed with remote sensing methods. Results of the methodological experiment show that survey-based methods tend to underestimate field residue cover. When quantitative data on cover are needed, the best estimates are provided by visual-aid protocols. For categorical analysis (i.e., >30% cover or not), visual-aid protocols and remote sensing methods perform equally well. Among survey-based methods, the strongest correlates of measurement error are total farm size, field size, distance, and slope. The results deliver a ranking of measurement options that can inform survey practitioners and researchers.
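The kind of comparison described above, scoring an alternative method against the line-transect benchmark both as a quantitative bias and as agreement on the >30% cover threshold, can be sketched as follows; the plot values are illustrative, not the study's data:

```python
# Hypothetical cover estimates (%) for the same plots: benchmark
# (line-transect) vs. one alternative measurement method.
benchmark = [12.0, 35.0, 48.0, 22.0, 60.0]
alternative = [8.0, 30.0, 45.0, 15.0, 55.0]

# Mean bias: negative values indicate underestimation relative
# to the line-transect benchmark (the pattern the study found
# for survey-based methods).
bias = sum(a - b for a, b in zip(alternative, benchmark)) / len(benchmark)

# Categorical agreement on the >30% cover threshold used in the paper.
agree = sum((a > 30) == (b > 30) for a, b in zip(alternative, benchmark))
agreement_rate = agree / len(benchmark)

print(f"mean bias: {bias:+.1f} percentage points")
print(f">30% threshold agreement: {agreement_rate:.0%}")
```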
A comparative analysis of mail and internet surveys
Benjamin D. Poole; David K. Loomis
2010-01-01
The field of survey research is constantly evolving with the introduction of new technologies. Each new mini-revolution brings criticism about the accuracy of the new survey method. The latest development in the survey research field has been increased reliance on Internet surveys. This paper compares data collected through a mixed-mode (mail and Internet) survey of...
Field manual for the collection of Navajo Nation streamflow-gage data
Hart, Robert J.; Fisk, Gregory G.
2014-01-01
The Field Manual for the Collection of Navajo Nation Streamflow-Gage Data (Navajo Field Manual) is based on established (standard) U.S. Geological Survey streamflow-gaging methods and provides guidelines specifically designed for the Navajo Department of Water Resources personnel who establish and maintain streamflow gages. The Navajo Field Manual addresses field visits, including essential field equipment and the selection of and routine visits to streamflow-gaging stations, examines surveying methods for determining peak flows (indirect measurements), discusses safety considerations, and defines basic terms.
Compilation of field methods used in geochemical prospecting by the U.S. Geological Survey
Lakin, Hubert William; Ward, Frederick Norville; Almond, Hy
1952-01-01
The field methods described in this report are those currently used in geochemical prospecting by the U. S. Geological Survey. Some have been published, others are being processed for publication, while others are still being investigated. The purpose in compiling these methods is to make them readily available in convenient form. The methods have not been thoroughly tested and none is wholly satisfactory. Research is being continued.
Manuals Used in the National Aquatic Resource Surveys
Various manuals are used to communicate the methods and guidelines for the National Aquatic Resource Surveys. The Field Operations Manual outlines the field protocols that crews will use to sample sites.
Enhancing Field Research Methods with Mobile Survey Technology
ERIC Educational Resources Information Center
Glass, Michael R.
2015-01-01
This paper assesses the experience of undergraduate students using mobile devices and a commercial application, iSurvey, to conduct a neighborhood survey. Mobile devices offer benefits for enhancing student learning and engagement. This field exercise created the opportunity for classroom discussions on the practicalities of urban research, the…
NASA Technical Reports Server (NTRS)
Marshall, S. E.; Bernhard, R.
1984-01-01
A survey of the most widely used methods for visualizing acoustic phenomena is presented. Emphasis is placed on acoustic processes at audible frequencies. Many visual problems are analyzed on computer graphic systems. A brief description of the current technology in computer graphics is included. The visualization technique survey will serve as a basis for recommending an optimum scheme for displaying acoustic fields on computer graphic systems.
Azil, Aishah H; Ritchie, Scott A; Williams, Craig R
2015-10-01
This qualitative study aimed to describe field worker perceptions, evaluations of worth, and time costs of routine dengue vector surveillance methods in Cairns (Australia), Kuala Lumpur and Petaling District (Malaysia). In Cairns, the BG-Sentinel trap is a favored method for field workers because of its user-friendliness, but is not as cost-efficient as the sticky ovitrap. In Kuala Lumpur, the Mosquito Larvae Trapping Device is perceived as a solution for the inaccessibility of premises to larval surveys. Nonetheless, the larval survey method is retained in Malaysia for prompt detection of dengue vectors. For dengue vector surveillance to be successful, there needs to be not only technical, quantitative evaluations of method performance but also an appreciation of how amenable field workers are to using particular methods. Here, we report novel field worker perceptions of dengue vector surveillance methods in addition to time analysis for each method. © 2014 APJPH.
Innovative techniques with multi-purpose survey vehicle for automated analysis of cross-slope data.
DOT National Transportation Integrated Search
2007-11-02
Manual surveying methods have long been used in the field of highway engineering to determine the cross-slope and longitudinal grade of an existing roadway. However, these methods are slow, tedious and labor intensive. Moreover, manual survey me...
van Velthoven, Michelle Helena; Li, Ye; Wang, Wei; Du, Xiaozhen; Wu, Qiong; Chen, Li; Majeed, Azeem; Rudan, Igor; Zhang, Yanfeng; Car, Josip
2013-01-01
Background We set up a collaboration between researchers in China and the UK that aimed to explore the use of mHealth in China. This is the first paper in a series on a large mHealth project that is part of this collaboration. This paper includes the aims and objectives of the mHealth project, our field site, and the detailed methods of two studies. Field site The field site for this mHealth project was Zhao County, which lies 280 km south of Beijing in Hebei Province, China. Methods We described the methodology of two studies: (i) a mixed methods study exploring factors influencing sample size calculations for mHealth-based health surveys and (ii) a cross-over study determining the validity of an mHealth text messaging data collection tool. The first study used mixed methods, both quantitative and qualitative, including: (i) two surveys with caregivers of young children, (ii) interviews with caregivers, village doctors, and participants of the cross-over study, and (iii) researchers' views. We combined data from caregivers, village doctors, and researchers to provide an in-depth understanding of factors influencing sample size calculations for mHealth-based health surveys. The second study used a randomised cross-over design to compare the traditional face-to-face survey method to the new text messaging survey method. We assessed data equivalence (intrarater agreement), the amount of information in responses, reasons for giving different responses, the response rate, characteristics of non-responders, and the error rate. Conclusions This paper described the objectives, field site and methods of a large mHealth project that is part of a collaboration between researchers in China and the UK. The mixed methods study evaluating factors that influence sample size calculations could help future studies estimate reliable sample sizes.
The cross-over study comparing face-to-face and text message survey data collection could help future studies develop their mHealth tools. PMID:24363919
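As an illustration of the kind of calculation the first study examines, the textbook sample-size formula for estimating a proportion, adjusted for an expected response rate, is one common starting point (a generic sketch, not the study's actual procedure):

```python
import math

# Standard sample-size formula for estimating a proportion,
# n = z^2 * p * (1 - p) / e^2, inflated for an expected response
# rate -- one of the practical factors such a study weighs.
def survey_sample_size(p=0.5, margin=0.05, z=1.96, response_rate=1.0):
    """Minimum number of people to invite for a proportion estimate
    with the given margin of error at the given confidence (z)."""
    n = (z**2 * p * (1 - p)) / margin**2
    return math.ceil(n / response_rate)

print(survey_sample_size())                   # 385 completed surveys
print(survey_sample_size(response_rate=0.6))  # 641 invitations if 60% respond
```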
1995 American travel survey : an overview of the survey design and methodology
DOT National Transportation Integrated Search
1995-01-01
This paper describes the methods used in the 1995 ATS. The introduction provides an overview of the purpose and objectives of the survey followed by a description of the survey and sample designs, survey field operations, and processing of survey d...
Survey Research: Methods, Issues and the Future
ERIC Educational Resources Information Center
Brewer, Ernest W.; Torrisi-Steele, Geraldine; Wang, Victor C. X.
2015-01-01
Survey research is prevalent among many professional fields. Both cost effective and time efficient, this method of research is commonly used for the purposes of gaining insight into the attitudes, thoughts, and opinions of populations. Additionally, because there are several types of survey research designs and data collection instruments, the…
Field operating experience in locating and re-recovering landslide-damaged oil wells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, R.J.
1974-01-01
Landslides have damaged 65 oil wells on Getty Oil Co.'s leases in the Ventura Avenue field. During a landslide, some wells may remain connected to the surface while other wells may be buried. Well damage ranges from slight bending to complete severing of all casing strings, and the depth of damage varies from 15 to 120 ft. Two major problems have been encountered when repair work is planned for a landslide-damaged well. The first problem is to locate the undamaged well casing below the landslide. The second problem is to recover the well and replace the damaged casing. Some methods used by Getty Oil Co. to locate the undamaged portion of a well are conventional surveys, dip-needle surveys, magnetometer surveys, kink-meter surveys, and test holes. Three methods have been used to recover landslide-damaged wells in the Ventura Avenue field. The simplest method is an open excavation made with standard earthmoving equipment. This method is limited to shallow depths and locations where the landslide would not be reactivated. To reach greater depths, special methods such as hand-dug or machine-dug shafts must be used. All three methods have been used successfully by Getty Oil Co.
NASA Astrophysics Data System (ADS)
Talvik, Silja; Oja, Tõnis; Ellmann, Artu; Jürgenson, Harli
2014-05-01
Gravity field models at a regional scale are needed for a number of applications, for example national geoid computation, processing of precise levelling data, and geological modelling. Thus the methods applied for modelling the gravity field from surveyed gravimetric information need to be considered carefully. The influence of using different gridding methods, the inclusion of unit or realistic weights, and indirect gridding of free air anomalies (FAA) are investigated in this study. Known gridding methods such as kriging (KRIG), least squares collocation (LSCO), continuous curvature (CCUR) and optimal Delaunay triangulation (ODET) are used to produce gridded gravity field surfaces. As the quality of the collected data varies considerably depending on the methods and instruments available or used in surveying, it is important to weight the input data. This puts additional demands on data maintenance, as accuracy information needs to be available for each data point participating in the modelling; this is complicated for older gravity datasets, where the uncertainties of not only the gravity values but also supplementary information, such as survey point positions, are not always known very accurately. A number of gravity field applications (e.g. geoid computation) demand an FAA model, the acquisition of which is also investigated. Instead of direct gridding, it can be more appropriate to proceed with indirect FAA modelling using a Bouguer anomaly grid to reduce the effect of topography on the resulting FAA model (e.g. near terraced landforms). The inclusion of different gridding methods, weights and indirect FAA modelling helps to improve gravity field modelling methods. It becomes possible to estimate the impact of varying methodical approaches on the gravity field modelling as statistical output is compared. Such knowledge helps assess the accuracy of gravity field models and their effect on the aforementioned applications.
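As a simple illustration of gridding scattered gravity observations onto a regular grid, the sketch below uses inverse-distance weighting; this is only a stand-in for the methods actually compared in the study (KRIG, LSCO, CCUR, ODET all need dedicated geostatistics software), and the anomaly values are synthetic:

```python
import numpy as np

# Synthetic scattered gravity anomaly observations (mGal); a real
# survey would also carry per-point accuracies usable as weights.
rng = np.random.default_rng(0)
px = rng.uniform(0, 100, 200)
py = rng.uniform(0, 100, 200)
anom = 5.0 * np.sin(px / 20.0) + 3.0 * np.cos(py / 15.0)

def idw_grid(px, py, values, gx, gy, power=2.0, eps=1e-6):
    """Inverse-distance-weighted gridding: each grid node gets a
    weighted average of the observations, closer points dominating."""
    out = np.empty(gx.shape)
    for i in np.ndindex(gx.shape):
        d = np.hypot(px - gx[i], py - gy[i]) + eps  # avoid div by zero
        w = 1.0 / d**power
        out[i] = np.sum(w * values) / np.sum(w)
    return out

gx, gy = np.meshgrid(np.linspace(0, 100, 40), np.linspace(0, 100, 40))
grid = idw_grid(px, py, anom, gx, gy)
print("grid shape:", grid.shape)
```

Because the result at each node is a convex combination of the observations, the gridded surface stays within the range of the input data, unlike spline-based methods that can overshoot near steep gradients.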
Environmental DNA as a Tool for Inventory and Monitoring of Aquatic Vertebrates
2017-07-01
Methods: channel surveys initially selected reaches based on access and visual indicators, with geomorphic calculations and a description of each reach. An environmental DNA lab protocol covers designing species-specific qPCR assays; species-specific surveys should use quantitative polymerase chain reaction (qPCR). eDNA sampling is compared to traditional field sampling with respect to sensitivity, detection probabilities, and cost efficiency.
Sabo, Samantha; Allen, Caitlin G; Sutkowi, Katherine; Wennerstrom, Ashley
2017-12-01
Community health workers (CHWs) are members of a growing profession in the United States. Studying this dynamic labor force is challenging, in part because its members have more than 100 different job titles. The demand for timely, accurate information about CHWs is increasing as the profession gains recognition for its ability to improve health outcomes and reduce costs. Although numerous surveys of CHWs have been conducted, the field lacks well-delineated methods for gaining access to this hard-to-identify workforce. We outline methods for surveying CHWs and promising approaches to engage the workforce and other stakeholders in conducting local, state, and national studies. We also highlight successful strategies to overcome challenges in CHW surveys and future directions for surveying the field.
Teaching Structure from Motion to Undergraduates: New Learning Module for Field Geoscience Courses
NASA Astrophysics Data System (ADS)
Pratt-Sitaula, B. A.; Shervais, K.; Crosby, C. J.; Douglas, B. J.; Crosby, B. T.; Charlevoix, D. J.
2016-12-01
With photogrammetry use expanding rapidly, it is essential to integrate these methods into undergraduate geosciences courses. The NSF-funded "GEodetic Tools for Societal Issues" (GETSI) project has recently published a module for field geoscience courses called "Analyzing High Resolution Topography with TLS and SfM" (serc.carleton.edu/getsi/teaching_materials/high-rez-topo/index.html). Structure from motion (SfM) and terrestrial laser scanning (TLS) are two valuable methods for generating high-resolution topographic landscape models. In addition to teaching the basic surveying methods, the module includes several specific applications that are tied to societally important geoscience research questions. The module goals are that students will be able to: 1) design and conduct a complex TLS and/or SfM survey to address a geologic research question; 2) articulate the societal impetus for answering a given research question; and 3) justify why TLS and/or SfM is the appropriate method in some circumstances. The module includes 6 units: Unit 1-TLS Introduction, Unit 1-SfM Introduction, Unit 2 Stratigraphic Section Survey, Unit 3 Fault Scarp Survey, Unit 4 Geomorphic Change Detection Survey, and Unit 5 Summative Assessment. One or both survey methods can be taught. Instructors choose which application/s to use from Units 2-4. Unit 5 Summative Assessment is flexibly written and can be used to assess any of the learned applications or others such as dinosaur tracks or seismic trench photomosaics. Prepared data sets are also provided for courses unable to visit the field. The included SfM learning manuals may also be of interest to researchers seeking to start with SfM; these are "SfM Guide of Instructors and Investigators" and "SfM Data Exploration and Processing Manual (Agisoft)". The module is appropriate for geoscience courses with field components such as field methods, geomorphology, geophysics, tectonics, and structural geology. 
All GETSI modules are designed and developed by teams of faculty and content experts and undergo rigorous review and classroom testing. GETSI is a collaborative project by UNAVCO (which runs NSF's Geodetic Facility), Indiana University, and Idaho State University. The Science Education Resource Center (SERC) provides assessment and evaluation expertise and webhosting.
Create and Publish a Hierarchical Progressive Survey (HiPS)
NASA Astrophysics Data System (ADS)
Fernique, P.; Boch, T.; Pineau, F.; Oberto, A.
2014-05-01
Since 2009, the CDS has promoted a visualization method based on the HEALPix sky tessellation. This method, called "Hierarchical Progressive Survey" or HiPS, allows one to display a survey progressively. It is particularly suited to all-sky surveys or deep fields. This visualization method is now integrated in several applications, notably Aladin, the SiTools/MIZAR CNES framework, and the recent HTML5 "Aladin Lite". More than one hundred surveys are already available in this view mode. In this article, we present the progress of this method and its recent adaptation to astronomical catalogs such as the GAIA simulation.
NASA Astrophysics Data System (ADS)
Trofymow, J. A.; Gougeon, F.; Kelley, J. W.
2017-12-01
Forest carbon (C) models require knowledge of C transfers due to intense disturbances such as fire, harvest, and slash burning. In such events, live trees die and C is transferred to detritus or exported as round wood. With burning, live and detrital C is lost as emissions. Burning can be incomplete, leaving wood charred and scattered, or in unburnt rings and piles. For harvests, all round wood volume is routinely measured, while dispersed and piled residue volumes are typically assessed in field surveys and scaled to a block. Recently, geospatial methods have been used to determine, for an entire block, piled residues using LiDAR or image point clouds (PC) and dispersed residues by analysis of high-resolution imagery. Second-growth Douglas-fir forests on eastern Vancouver Island were examined: 4 blocks at Oyster River (OR) and 2 at Northwest Bay (NB). OR blocks were cut in winter 2011 and piled in spring 2011; a field survey, aerial RGB imagery and a LiDAR PC were acquired in fall 2011; piles were burned and burn residues surveyed; and post-burn aerial RGB imagery was acquired in 2012. NB blocks were cut in fall 2014 and piled in spring 2015; a field survey, UAV RGB imagery and an image PC were acquired in summer 2015; piles were burned and burn residues surveyed in spring 2016; and post-burn UAV RGB imagery and a PC were acquired in fall 2016. Volume-to-biomass conversion used surveyed species proportions and wood density. At OR, round wood was 261.7 SE 13.1, firewood 1.7 SE 0.3, and dispersed residue by survey 13.8 SE 3.6 tonnes dry mass (t dm) ha-1. Piled residues were 8.2 SE 0.9 t dm ha-1 from pile surveys vs. 25.0 SE 5.9 t dm ha-1 from LiDAR PC bulk pile volumes and packing ratios. Post-burn, piles lost 5.8 SE 0.5 t dm ha-1 by survey of burn residues vs. 18.2 SE 4.7 t dm ha-1 from pile volume changes using the 2011 LiDAR PC and 2012 imagery. The percentage of initial merchantable biomass exported as round and fire wood, remaining as dispersed and piled residue, and lost to burning was, respectively, 92.5%, 5.5% and 2% using only field methods vs. 87%, 7% and 6% from dispersed residue surveys and LiDAR PC pile volumes. At NB, preliminary analysis shows the post-burn difference in 2015 to 2016 UAV PC pile volumes was similar to that obtained using the 2015 UAV PC pile volume and the 2016 orthophoto pile area burned, suggesting the two geospatial methods are comparable. Comparisons will be made for transfers in all 6 blocks using only field survey or geospatial methods.
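The volume-to-biomass step mentioned above (species proportions times wood density) amounts to a weighted conversion; a minimal sketch with illustrative values, not the study's measurements:

```python
# Convert residue volume (m^3) to dry biomass (tonnes) using species
# proportions and basic wood densities. All numbers are illustrative.
residue_volume_m3 = 120.0

# Hypothetical species mix and densities (t dry mass per m^3).
species = {
    "Douglas-fir": {"proportion": 0.70, "density": 0.45},
    "western hemlock": {"proportion": 0.30, "density": 0.42},
}

# Weighted sum over the species mix.
biomass_t = sum(
    residue_volume_m3 * s["proportion"] * s["density"]
    for s in species.values()
)
print(f"residue biomass: {biomass_t:.1f} t dm")  # 52.9 t dm
```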
A survey on object detection in optical remote sensing images
NASA Astrophysics Data System (ADS)
Cheng, Gong; Han, Junwei
2016-07-01
Object detection in optical remote sensing images, a fundamental but challenging problem in the field of aerial and satellite image analysis, plays an important role in a wide range of applications and has received significant attention in recent years. While numerous methods exist, a deep review of the literature concerning generic object detection is still lacking. This paper aims to provide a review of the recent progress in this field. Different from several previously published surveys that focus on a specific object class such as buildings or roads, we concentrate on more generic object categories including, but not limited to, roads, buildings, trees, vehicles, ships, airports and urban areas. Covering about 270 publications, we survey (1) template matching-based object detection methods, (2) knowledge-based object detection methods, (3) object-based image analysis (OBIA)-based object detection methods, (4) machine learning-based object detection methods, and (5) five publicly available datasets and three standard evaluation metrics. We also discuss the challenges of current studies and propose two promising research directions, namely deep learning-based feature representation and weakly supervised learning-based geospatial object detection. We hope this survey will help researchers gain a better understanding of this research field.
Kurata, Keiko; Morioka, Tomoko; Yokoi, Keiko; Matsubayashi, Mamiko
2013-01-01
Introduction This study clarifies the trends observed in open access (OA) in the biomedical field between 2006 and 2010, and explores possible explanations for the differences in OA rates revealed in recent surveys. Methods The study consists of a main survey and two supplementary surveys. In the main survey, a manual Google search was performed to investigate whether full-text versions of articles from PubMed were freely available. Target samples were articles published in 2005, 2007, and 2009; the searches were performed a year after publication, in 2006, 2008, and 2010, respectively. Using the search results, we classified the OA provision methods into seven categories. The supplementary surveys calculated the OA rate using two search functions on PubMed: “LinkOut” and “Limits.” Results The main survey concluded that the OA rate increased significantly between 2006 and 2010: the OA rate in 2010 (50.2%) was twice that in 2006 (26.3%). Furthermore, the majority of OA articles were available from OA journal (OAJ) websites, indicating that OAJs have consistently been a significant contributor to OA throughout the period. OA availability through the PubMed Central (PMC) repository also increased significantly. The OA rates obtained from the two supplementary surveys were lower than those found in the main survey; “LinkOut” could find only 40% of the OA articles in the main survey. Discussion OA articles in the biomedical field have more than a 50% share. OA has been achieved largely through OAJs. The differences between the OA rates in our surveys and those in recent surveys appear to stem from differences in sampling methods and verification procedures. PMID:23658683
The Impact of Sound-Field Systems on Learning and Attention in Elementary School Classrooms
ERIC Educational Resources Information Center
Dockrell, Julie E.; Shield, Bridget
2012-01-01
Purpose: The authors evaluated the installation and use of sound-field systems to investigate the impact of these systems on teaching and learning in elementary school classrooms. Methods: The evaluation included acoustic surveys of classrooms, questionnaire surveys of students and teachers, and experimental testing of students with and without…
Methods for the survey and genetic analysis of populations
Ashby, Matthew
2003-09-02
The present invention relates to methods for performing surveys of the genetic diversity of a population. The invention also relates to methods for performing genetic analyses of a population. The invention further relates to methods for the creation of databases comprising the survey information and the databases created by these methods. The invention also relates to methods for analyzing the information to correlate the presence of nucleic acid markers with desired parameters in a sample. These methods have application in the fields of geochemical exploration, agriculture, bioremediation, environmental analysis, clinical microbiology, forensic science and medicine.
Nkouawa, Agathe; Sako, Yasuhito; Li, Tiaoying; Chen, Xingwang; Nakao, Minoru; Yanagida, Tetsuya; Okamoto, Munehiro; Giraudoux, Patrick; Raoul, Francis; Nakaya, Kazuhiro; Xiao, Ning; Qiu, Jiamin; Qiu, Dongchuan; Craig, Philip S; Ito, Akira
2012-12-01
In this study, we applied a loop-mediated isothermal amplification (LAMP) method for the identification of human Taenia tapeworms in Tibetan communities in Sichuan, China. Out of 51 proglottids recovered from 35 carriers, 9, 1, and 41 samples were identified as Taenia solium, Taenia asiatica and Taenia saginata, respectively. The same results were obtained afterward in the laboratory, except for one sample. These results demonstrate that the LAMP method enables rapid identification of parasites in field surveys, suggesting that it could contribute to the control of Taenia infections in endemic areas. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Comparability of river suspended-sediment sampling and laboratory analysis methods
Groten, Joel T.; Johnson, Gregory D.
2018-03-06
Accurate measurements of suspended sediment, a leading water-quality impairment in many Minnesota rivers, are important for managing and protecting water resources; however, water-quality standards for suspended sediment in Minnesota are based on grab field sampling and total suspended solids (TSS) laboratory analysis methods that have underrepresented concentrations of suspended sediment in rivers compared to U.S. Geological Survey equal-width-increment or equal-discharge-increment (EWDI) field sampling and suspended sediment concentration (SSC) laboratory analysis methods. Because of this underrepresentation, the U.S. Geological Survey, in collaboration with the Minnesota Pollution Control Agency, collected concurrent grab and EWDI samples at eight sites to compare results obtained using different combinations of field sampling and laboratory analysis methods. Study results determined that grab field sampling and TSS laboratory analysis results were biased substantially low compared to EWDI sampling and SSC laboratory analysis results, respectively. Differences in both field sampling and laboratory analysis methods caused grab and TSS methods to be biased substantially low; the difference attributable to laboratory analysis methods was slightly greater than that attributable to field sampling methods. Sand-sized particles had a strong effect on the comparability of the field sampling and laboratory analysis methods. These results indicated that grab field sampling and TSS laboratory analysis methods fail to capture most of the sand being transported by the stream, and that the difference is smaller between grab samples analyzed for TSS and the fine fraction of SSC. Even though differences are present, the strong correlations between SSC and TSS concentrations provide the opportunity to develop site-specific relations that account for the transport processes not captured by grab field sampling and TSS laboratory analysis methods.
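A site-specific relation of the kind mentioned in the closing sentence is, in its simplest form, a least-squares fit of SSC on TSS from paired concurrent samples; the concentrations below are invented for illustration:

```python
import numpy as np

# Paired concurrent samples (mg/L) at one site -- illustrative values,
# with TSS biased low relative to SSC as the study found.
tss = np.array([45.0, 80.0, 120.0, 200.0, 310.0])
ssc = np.array([60.0, 110.0, 170.0, 290.0, 450.0])

# Ordinary least squares fit of SSC = a * TSS + b for this site.
a, b = np.polyfit(tss, ssc, 1)

def estimate_ssc(tss_value):
    """Estimate SSC from a TSS measurement using the site relation."""
    return a * tss_value + b

print(f"SSC ≈ {a:.2f} * TSS + {b:.1f}")
```

A slope above 1 reflects the low bias of TSS relative to SSC; in practice such relations are developed per site, since the sand fraction driving the bias varies between streams.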
Cheng, Gong; Huang, Lu-qi; Xue, Da-yuan; Zhang, Xiao-bo
2014-12-01
The survey of traditional knowledge related to Chinese materia medica resources is an important component and one of the innovative aspects of the fourth national survey of Chinese materia medica resources. China has rich traditional knowledge of traditional Chinese medicine (TCM), and the comprehensive investigation of TCM traditional knowledge aims to promote the conservation and sustainable use of Chinese materia medica resources. Building upon the field work of pilot investigations, this paper introduces the essential procedures and key methods for conducting the survey of traditional knowledge related to Chinese materia medica resources. The essential procedures are as follows. First is the preparation phase. It is important to review all relevant literature and provide training to the survey teams so that they have a clear understanding of the concept of traditional knowledge and master the key survey methods. Second is the field investigation phase. When conducting field investigations, survey teams should identify traditional knowledge holders by using the 'snowball method' and record the traditional knowledge after obtaining prior informed consent from the traditional knowledge holders. Researchers should fill out the survey forms provided by the Technical Specification of the Fourth National Survey of Chinese Materia Medica Resources. Researchers should pay particular attention to the scope of traditional knowledge and the method of inheriting the knowledge, which are the key information for traditional knowledge holders and potential users to reach mutually agreed terms and achieve benefit sharing. Third is the data compilation and analysis phase. Researchers should compile and edit the TCM traditional knowledge in accordance with intellectual property rights requirements so that the information collected through the national survey can serve as the basic data for the TCM traditional knowledge database.
The key methods of the survey include regional division of Chinese materia medica resources, interviews with key information holders, and standardization of information. In particular, using the 'snowball method' can effectively identify traditional knowledge holders in the targeted regions, and ensuring that traditional knowledge holders give prior informed consent before sharing information with researchers protects the rights of the traditional knowledge holders. Employing the right survey methods is not only the key to obtaining traditional knowledge related to Chinese materia medica resources, but also the pathway to fulfilling the objectives of access and benefit sharing stipulated in the Convention on Biological Diversity. It will promote the legal protection of TCM traditional knowledge and the conservation of TCM intangible cultural heritage.
Survey of predators and sampling method comparison in sweet corn.
Musser, Fred R; Nyrop, Jan P; Shelton, Anthony M
2004-02-01
Natural predation is an important component of integrated pest management that is often overlooked because it is difficult to quantify and perceived to be unreliable. To begin incorporating natural predation into sweet corn (Zea mays L.) pest management, a predator survey was conducted and then three sampling methods were compared for their ability to accurately monitor the most abundant predators. A predator survey on sweet corn foliage in New York between 1999 and 2001 identified 13 species. Orius insidiosus (Say), Coleomegilla maculata (De Geer), and Harmonia axyridis (Pallas) were the most numerous predators in all years. To determine the best method for sampling adult and immature stages of these predators, comparisons were made among nondestructive field counts, destructive counts, and yellow sticky cards. Field counts were correlated with destructive counts for all populations, but field counts of small insects were biased. Sticky cards underrepresented immature populations. Yellow sticky cards were more attractive to C. maculata adults than H. axyridis adults, especially before pollen shed, making coccinellid population estimates based on sticky cards unreliable. Field counts were the most precise method for monitoring adult and immature stages of the three major predators. Future research on predicting predation of pests in sweet corn should be based on field counts of predators because these counts are accurate, have no associated supply costs, and can be made quickly.
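The correlation check used to compare paired sampling methods (e.g., field counts against destructive counts) can be sketched in a few lines. The counts below are invented for illustration, not the study's data.

```python
def pearson_r(x, y):
    """Pearson correlation between two paired count series."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sum((a - mean_x) ** 2 for a in x) ** 0.5
    sd_y = sum((b - mean_y) ** 2 for b in y) ** 0.5
    return cov / (sd_x * sd_y)

# Hypothetical paired counts of a predator per 10 plants in six plots:
field_counts = [3, 5, 2, 8, 6, 4]        # nondestructive field counts
destructive_counts = [4, 6, 2, 9, 7, 5]  # destructive counts, same plots
r = pearson_r(field_counts, destructive_counts)
```

A high r indicates the cheaper nondestructive counts track the destructive benchmark; a systematic offset (as reported for small insects) would not show up in r alone and needs a separate bias check.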
Terminating Sequential Delphi Survey Data Collection
ERIC Educational Resources Information Center
Kalaian, Sema A.; Kasim, Rafa M.
2012-01-01
The Delphi survey technique is an iterative mail or electronic (e-mail or web-based) survey method used to obtain agreement or consensus among a group of experts in a specific field on a particular issue through multiple well-designed and systematic sequential rounds of survey administration. Each of the multiple rounds of the Delphi survey…
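A common numeric basis for deciding when to terminate Delphi rounds is a consensus statistic such as Kendall's coefficient of concordance W. A minimal sketch, assuming each expert supplies a complete ranking with no ties (the threshold value is illustrative, not from this paper):

```python
def kendalls_w(rankings):
    """Kendall's W for m experts each ranking n items with ranks 1..n (no ties).
    W = 1 means perfect agreement among experts; W = 0 means none."""
    m, n = len(rankings), len(rankings[0])
    # Column totals: summed rank each item received across experts.
    totals = [sum(r[i] for r in rankings) for i in range(n)]
    mean_total = sum(totals) / n
    s = sum((t - mean_total) ** 2 for t in totals)
    return 12 * s / (m ** 2 * (n ** 3 - n))
```

A stopping rule might then end the sequential rounds once W exceeds a pre-chosen threshold (say 0.7) or stops changing meaningfully between rounds.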
Multiple field-based methods to assess the potential impacts of seismic surveys on scallops.
Przeslawski, Rachel; Huang, Zhi; Anderson, Jade; Carroll, Andrew G; Edmunds, Matthew; Hurt, Lynton; Williams, Stefan
2018-04-01
Marine seismic surveys are an important tool to map geology beneath the seafloor and manage petroleum resources, but they are also a source of underwater noise pollution. A mass mortality of scallops in the Bass Strait, Australia, occurred a few months after a marine seismic survey in 2010, and fishing groups were concerned about the potential relationship between the two events. The current study used three field-based methods to investigate the potential impact of marine seismic surveys on scallops in the region: 1) dredging and 2) deployment of Autonomous Underwater Vehicles (AUVs) were undertaken to examine the potential response of two species of scallops (Pecten fumatus, Mimachlamys asperrima) before, two months after, and ten months after a 2015 marine seismic survey; and 3) MODIS satellite data revealed patterns of sea surface temperatures from 2006-2016. Results from the dredging and AUV components show no evidence of scallop mortality attributable to the seismic survey, although sub-lethal effects cannot be excluded. The remote sensing revealed a pronounced thermal spike in the eastern Bass Strait between February and May 2010, overlapping the scallop beds that suffered extensive mortality and coinciding almost exactly with the dates of operation of the 2010 seismic survey. The acquisition of in situ data coupled with consideration of commercial seismic arrays meant that results were ecologically realistic, while the paired field-based components (dredging, AUV imagery) provided a failsafe against challenges associated with working wholly in the field. This study expands our knowledge of the potential environmental impacts of marine seismic surveys and will inform future applications for marine seismic surveys, as well as the assessment of such applications by regulatory authorities. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
3D Inversion of Natural Source Electromagnetics
NASA Astrophysics Data System (ADS)
Holtham, E. M.; Oldenburg, D. W.
2010-12-01
The superior depth of investigation of natural source electromagnetic techniques makes these methods excellent candidates for crustal studies as well as for mining and hydrocarbon exploration. The traditional natural source method, the magnetotelluric (MT) technique, has practical limitations because the surveys are costly and time consuming due to the labor intensive nature of ground based surveys. In an effort to continue to use the penetration advantage of natural sources, it has long been recognized that tipper data, the ratio of the local vertical magnetic field to the horizontal magnetic field, provide information about 3D electrical conductivity structure. It was this understanding that prompted the development of AFMAG (Audio Frequency Magnetics) and recently the new airborne Z-Axis Tipper Electromagnetic Technique (ZTEM). In ZTEM, the vertical component of the magnetic field is recorded above the entire survey area, while the horizontal fields are recorded at a ground-based reference station. MT processing techniques yield frequency domain transfer functions typically between 30-720 Hz that relate the vertical fields over the survey area to the horizontal fields at the reference station. The result is a cost effective procedure for collecting natural source EM data and for finding large scale targets at moderate depths. It is well known however that 1D layered structures produce zero vertical magnetic fields and thus ZTEM data cannot recover such background conductivities. This is in sharp contrast to the MT technique where electric fields are measured and a 1D background conductivity can be recovered from the off diagonal elements of the impedance tensor. While 1D models produce no vertical fields, two and three dimensional structures will produce anomalous currents and a ZTEM response. For such models the background conductivity structure does affect the data. 
In general however, the ZTEM data have weak sensitivity to the background conductivity and while we show that it is possible to obtain the background structure by inverting the ZTEM data alone, it is desirable to obtain robust background conductivity information from other sources. This information could come from a priori geologic and petrophysical information or from additional geophysical data such as MT. To counter the costly nature of large MT surveys and the limited sensitivity of the ZTEM technique to the background conductivity we show that an effective method is to collect and invert both MT and ZTEM data. A sparse MT survey grid can gather information about the background conductivity and deep structures while keeping the survey costs affordable. Higher spatial resolution at moderate depths can be obtained by flying multiple lines of ZTEM data.
Grais, Rebecca F; Luquero, Francisco J; Grellety, Emmanuel; Pham, Heloise; Coghlan, Benjamin; Salignon, Pierre
2009-01-01
Survey estimates of mortality and malnutrition are commonly used to guide humanitarian decision-making. Currently, different methods of conducting field surveys are the subject of debate among epidemiologists. Beyond the technical arguments, decision makers may find it difficult to conceptualize what the estimates actually mean. For instance, what makes this particular situation an emergency? And how should the operational response be adapted accordingly? This brings into question not only the quality of the survey methodology, but also the difficulties epidemiologists face in interpreting results and selecting the most important information to guide operations. As a case study, we reviewed mortality and nutritional surveys conducted in North Kivu, Democratic Republic of Congo (DRC), published from January 2006 to January 2009. We performed a PubMed/Medline search for published articles and scanned publicly available humanitarian databases and clearinghouses for grey literature. To evaluate the surveys, we developed minimum reporting criteria based on available guidelines and selected peer-reviewed articles. We identified 38 reports through our search strategy; three surveys met our inclusion criteria. The surveys varied in methodological quality. Reporting against minimum criteria was generally good, but presentation of ethical procedures, raw data and survey limitations was missing from all surveys. All surveys also failed to consider contextual factors important for data interpretation. From this review, we conclude that mechanisms to ensure sound survey design and conduct must be implemented by operational organisations to improve data quality and reporting. Training in data interpretation would also be useful. Novel survey methods should be trialled and prospective data gathering (surveillance) employed wherever feasible. PMID:19744319
Clare, John; McKinney, Shawn T.; DePue, John E.; Loftin, Cynthia S.
2017-01-01
It is common to use multiple field sampling methods when implementing wildlife surveys to compare method efficacy or cost efficiency, integrate distinct pieces of information provided by separate methods, or evaluate method-specific biases and misclassification error. Existing models that combine information from multiple field methods or sampling devices permit rigorous comparison of method-specific detection parameters, enable estimation of additional parameters such as false-positive detection probability, and improve occurrence or abundance estimates, but with the assumption that the separate sampling methods produce detections independently of one another. This assumption is tenuous if methods are paired or deployed in close proximity simultaneously, a common practice that reduces the additional effort required to implement multiple methods and reduces the risk that differences between method-specific detection parameters are confounded by other environmental factors. We develop occupancy and spatial capture–recapture models that permit covariance between the detections produced by different methods, use simulation to compare estimator performance of the new models to models assuming independence, and provide an empirical application based on American marten (Martes americana) surveys using paired remote cameras, hair catches, and snow tracking. Simulation results indicate existing models that assume that methods independently detect organisms produce biased parameter estimates and substantially understate estimate uncertainty when this assumption is violated, while our reformulated models are robust to either methodological independence or covariance. Empirical results suggested that remote cameras and snow tracking had comparable probability of detecting present martens, but that snow tracking also produced false-positive marten detections that could potentially substantially bias distribution estimates if not corrected for. 
Remote cameras detected marten individuals more readily than passive hair catches. Inability to photographically distinguish individual sex did not appear to induce negative bias in camera density estimates; instead, hair catches appeared to produce detection competition between individuals that may have been a source of negative bias. Our model reformulations broaden the range of circumstances in which analyses incorporating multiple sources of information can be robustly used, and our empirical results demonstrate that using multiple field-methods can enhance inferences regarding ecological parameters of interest and improve understanding of how reliably survey methods sample these parameters.
FIELD OPERATIONS AND METHODS FOR MEASURING THE ECOLOGICAL CONDITION OF WADEABLE STREAMS
The methods and instructions for field operations presented in this manual for surveys of wadeable streams were developed and tested during 5 years of pilot and demonstration projects (1993 through 1997). These projects were conducted under the sponsorship of the U.S. Environment...
EVALUATION OF A MEASUREMENT METHOD FOR FOREST VEGETATION IN A LARGE-SCALE ECOLOGICAL SURVEY
We evaluate a field method for determining species richness and canopy cover of vascular plants for the Forest Health Monitoring Program (FHM), an ecological survey of U.S. forests. Measurements are taken within 12 1-m2 quadrats on 1/15 ha plots in FHM. Species richness and cover...
LINKING JUVENILE FISH AND THEIR HABITATS: AN EXAMPLE FROM NARRAGANSETT BAY ,RHODE ISLAND
We used two methods and existing field survey data to link juvenile fish and their habitats. The first method used seine survey data collected monthly from July to October 1988-1996 at fixed stations in Narragansett Bay, Rhode Island. Thirteen fish species making up 1% or more of...
Too Fast to Measure: Network Adjustment of Rapidly Changing Gravity Fields
NASA Astrophysics Data System (ADS)
Kennedy, J.; Ferre, T. P. A.
2014-12-01
Measurements of spatially-variable gravity at the field scale are difficult; measurements of the time-varying field even more so. Every previous gravity survey using relative gravimeters—still the workhorse of gravity studies, despite their nearly 80-year history—has assumed a static gravity field during the course of a survey, which may last days to weeks. With recently-improved instrumentation, however, measurements of fields changing on the order of tens of nm/sec2 per day are now possible. In particular, the A-10 portable absolute gravimeter provides not only absolute control, but also the change in that control during the course of a survey. Using digitally-recording spring-based relative gravimeters (namely, the ZLS Burris meter and the Scintrex CG-5), with their more efficient data collection and lower drift than previous generations, many more data are collected in a day. We demonstrate a method for incorporating into the least-squares network adjustment of relative gravity data a relation between the rate of change of gravity, dg, and distance from an infiltration source, x. This relation accounts for the fact that gravity at stations adjacent to the infiltration source changes more rapidly than at stations further away; if all measurements collected over several days are to be included in a single network adjustment, consideration of this change is required. Two methods are used to simulate the dg(x) relation: a simple model where dg is a linear function of x, and a coupled hydrogeophysical method where a groundwater flow model predicts the nonlinear spatial variation of dg. Then, the change in gravity between different, independently adjusted surveys is used to parameterize the groundwater model. 
Data from two recent field examples, an artificial recharge facility near Tucson, Arizona, USA, and from the 2014 Lower Colorado River pulse flow experiment, clearly show the need to account for gravity change during a survey; maximum rates of change for the two studies were up to 30 and 50 nm/sec2 per day, respectively.
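The distance-dependent rate term can be folded directly into an ordinary least-squares network adjustment. The sketch below assumes the simpler of the two models mentioned, a purely linear rate dg(x) = b·x, and uses synthetic readings; station and observation values are made up. Note that any rate component common to all stations cancels in relative differences, so only the distance-dependent coefficient is estimable without absolute ties.

```python
import numpy as np

def adjust_network(pairs, times, diffs, x):
    """Least-squares network adjustment of relative gravity differences
    under a time-varying field g_i(t) = g0_i + b * x_i * t, where x_i is
    station i's distance from the infiltration source. Station 0 is the
    datum (g0_0 = 0). Returns (g0 for stations 1..n-1, rate coefficient b)."""
    n, m = len(x), len(diffs)
    A = np.zeros((m, n))  # cols 0..n-2: g0_1..g0_{n-1}; col n-1: b
    for k, ((i, j), t) in enumerate(zip(pairs, times)):
        if j > 0:
            A[k, j - 1] += 1.0
        if i > 0:
            A[k, i - 1] -= 1.0
        A[k, n - 1] = (x[j] - x[i]) * t  # distance-dependent change term
    sol, *_ = np.linalg.lstsq(A, np.asarray(diffs, float), rcond=None)
    return sol[:n - 1], sol[n - 1]
```

With measurements spanning several days, omitting the `(x[j] - x[i]) * t` column would alias the infiltration-driven change into the station values, which is exactly the error a static-field adjustment commits.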
Transient ElectroMagnetic and Electric Self-Potential survey in the TAG hydrothermal field in MAR
NASA Astrophysics Data System (ADS)
Tao, C.; Deng, X.; Wu, G.; Xi, Z.; Zhou, D.; Zuo, L.
2012-12-01
The TAG hydrothermal field is one of the most studied hydrothermal fields. The field covers an area of 5 km × 5 km, which includes a zone of low-temperature Mn- and Fe-oxides and nontronites, relict massive sulfide mounds, and an active hydrothermal mound (the TAG mound) [Thompson, 1985; Rona, 1993]. A drilling program was performed at the TAG mound during ODP (Ocean Drilling Program) Leg 158 [Humphris, 1996]. In 1996, an electrical resistivity survey of the TAG mound was conducted using innovative transient electric dipole-dipole instruments carried by DSV 'Alvin' [Cairns et al., 1996; Von Herzen et al., 1996]. In June 2012, the 2nd Leg of the Chinese 26th cruise was carried out in the TAG hydrothermal field at the Mid-Atlantic Ridge by R/V DAYANGYIHAO. Six TEM (Transient ElectroMagnetic) survey lines were deployed, four of which crossed the ODP Leg 158 drilling area. In addition, two SP (Electric Self-Potential) survey lines crossed the ODP drilling area. The TEM survey results preliminarily revealed the vertical structure of the TAG hydrothermal field. The results of both TEM and SP are consistent with the ODP drilling results, and also agree well with the temperature and water-column anomalies obtained during this leg. Preliminary results show that the TEM and SP methods are capable of revealing the horizontal and vertical distribution of hydrothermal sulfide fields.
A Low-Cost Energy-Efficient Cableless Geophone Unit for Passive Surface Wave Surveys.
Dai, Kaoshan; Li, Xiaofeng; Lu, Chuan; You, Qingyu; Huang, Zhenhua; Wu, H Felix
2015-09-25
The passive surface wave survey is a practical, non-invasive seismic exploration method that has increasingly been used in geotechnical engineering. However, in situ deployment of traditional wired geophones is labor intensive for a dense sensor array. Alternatively, stand-alone seismometers can be used, but they are bulky, heavy, and expensive because they are usually designed for long-term monitoring. To better facilitate field applications of the passive surface wave survey, a low-cost energy-efficient geophone system was developed in this study. The hardware design is presented in this paper. To validate the system's functionality, both laboratory and field experiments were conducted. The unique feature of this newly-developed cableless geophone system allows for rapid field applications of the passive surface wave survey with dense array measurements.
Sensitivity analysis of discrete structural systems: A survey
NASA Technical Reports Server (NTRS)
Adelman, H. M.; Haftka, R. T.
1984-01-01
Methods for calculating sensitivity derivatives for discrete structural systems are surveyed, primarily covering literature published during the past two decades. Methods are described for calculating derivatives of static displacements and stresses, eigenvalues and eigenvectors, transient structural response, and derivatives of optimum structural designs with respect to problem parameters. The survey is focused on publications addressed to structural analysis, but also includes a number of methods developed in nonstructural fields such as electronics, controls, and physical chemistry which are directly applicable to structural problems. Most notable among the nonstructural-based methods are the adjoint variable technique from control theory, and the Green's function and FAST methods from physical chemistry.
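The core of displacement-sensitivity calculation can be illustrated by direct differentiation of the static equilibrium equations: from K(p)u = f one obtains du/dp = -K⁻¹(∂K/∂p)u. The two-spring system below is an invented toy example, not taken from the surveyed literature.

```python
import numpy as np

def stiffness(k1, k2):
    """Stiffness matrix of a 2-DOF chain: ground--k1--mass1--k2--mass2."""
    return np.array([[k1 + k2, -k2],
                     [-k2,      k2]])

def displacement_sensitivity(k1, k2, f):
    """Static displacements u and their sensitivity du/dk1 via
    direct differentiation: K du/dk1 = -(dK/dk1) u."""
    K = stiffness(k1, k2)
    u = np.linalg.solve(K, f)
    dK_dk1 = np.array([[1.0, 0.0],
                       [0.0, 0.0]])  # only the (1,1) entry depends on k1
    du_dk1 = np.linalg.solve(K, -dK_dk1 @ u)
    return u, du_dk1
```

A finite-difference check (perturb k1, re-solve, difference) confirms the analytic derivative; for many design variables, adjoint-variable formulations trade these repeated solves for a single adjoint solve.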
ERIC Educational Resources Information Center
Thomson, Pat
2017-01-01
The field of educational leadership, management and administration (ELMA) uses methods drawn primarily from cognate educational disciplines. But does this matter? This paper explores the methods used in recently published papers through a snapshot of six issues of six ELMA journals. The analysis showed a preponderance of survey, interview and case…
The IMACS Cluster Building Survey. I. Description of the Survey and Analysis Methods
NASA Technical Reports Server (NTRS)
Oemler Jr., Augustus; Dressler, Alan; Gladders, Michael G.; Rigby, Jane R.; Bai, Lei; Kelson, Daniel; Villanueva, Edward; Fritz, Jacopo; Rieke, George; Poggianti, Bianca M.;
2013-01-01
The IMACS Cluster Building Survey uses the wide field spectroscopic capabilities of the IMACS spectrograph on the 6.5 m Baade Telescope to survey the large-scale environment surrounding rich intermediate-redshift clusters of galaxies. The goal is to understand the processes which may be transforming star-forming field galaxies into quiescent cluster members as groups and individual galaxies fall into the cluster from the surrounding supercluster. This first paper describes the survey: the data taking and reduction methods. We provide new calibrations of star formation rates (SFRs) derived from optical and infrared spectroscopy and photometry. We demonstrate that there is a tight relation between the observed SFR per unit B luminosity, and the ratio of the extinctions of the stellar continuum and the optical emission lines. With this, we can obtain accurate extinction-corrected colors of galaxies. Using these colors as well as other spectral measures, we determine new criteria for the existence of ongoing and recent starbursts in galaxies.
NASA Astrophysics Data System (ADS)
Brekhov, O. M.; Tsvetkov, Yu. P.; Ivanov, V. V.; Filippov, S. V.; Tsvetkova, N. M.
2015-09-01
The results of stratospheric balloon gradient geomagnetic surveys at an altitude of ~30 km with the use of a long (6 km) measuring base oriented along the vertical line are considered. The purposes of these surveys are the study of the magnetic field formed by deep sources and the estimation of errors in modern analytical models of the geomagnetic field. An independent method of determining errors in global analytical models of the normal magnetic field of the Earth (MFE) is substantiated. A new technique for identifying magnetic anomalies from surveys on long routes is considered. The analysis of gradient magnetic surveys on board the balloon revealed previously unknown features of the geomagnetic field. Using the balloon data, the EMM/720 model of the geomagnetic field (http://www.ngdc.noaa.gov/geomag/EMM) is investigated, and it is shown that this model unsatisfactorily represents the anomalous MFE, at least at an altitude of 30 km in the area of our surveys. The unsatisfactory quality of aeromagnetic (ground-based) data is also revealed by wavelet analysis of the ground-based and balloon magnetic profiles. It is shown that the ground-based profiles do not contain inhomogeneities more than 130 km in size, whereas the balloon profiles (1000 km in strike extent) contain inhomogeneities up to 600 km in size, and the location of the latter coincides with the location of the satellite magnetic anomaly. On the basis of the balloon data, it is shown that low-altitude aeromagnetic surveys, for fundamental reasons, incorrectly reproduce the magnetic field of deep sources. This prevents the reliable upward continuation of ground-based magnetic anomalies from the surface of the Earth. It is shown that an adequate global model of magnetic anomalies in circumterrestrial space, developed up to 720 spherical harmonics, must be constructed only from data obtained at satellite and stratospheric altitudes. 
Such a model can serve as a basis for the refined study of the structure and magnetic properties of the Earth's crust at its deep horizons, in order to search for resources there, and so on.
Zhou, C.; Liu, L.; Lane, J.W.
2001-01-01
A nonlinear tomographic inversion method that uses first-arrival travel-time and amplitude-spectra information from cross-hole radar measurements was developed to simultaneously reconstruct electromagnetic velocity and attenuation distributions in earth materials. Inversion methods were developed to analyze single cross-hole tomography surveys and differential tomography surveys. Assuming the earth behaves as a linear system, the inversion methods do not require estimation of source radiation pattern, receiver coupling, or geometrical spreading. The data analysis and tomographic inversion algorithm were applied to synthetic test data and to cross-hole radar field data provided by the US Geological Survey (USGS). The cross-hole radar field data were acquired at the USGS fractured-rock field research site at Mirror Lake near Thornton, New Hampshire, before and after injection of a saline tracer, to monitor the transport of electrically conductive fluids in the image plane. Results from the synthetic data test demonstrate the algorithm's computational efficiency and indicate that the method can robustly reconstruct electromagnetic (EM) wave velocity and attenuation distributions in earth materials. The field test results outline zones of velocity and attenuation anomalies consistent with the findings of previous investigators; however, the tomograms appear to be quite smooth. Further work is needed to effectively find the optimal smoothness criterion in applying the Tikhonov regularization in the nonlinear inversion algorithms for cross-hole radar tomography. © 2001 Elsevier Science B.V. All rights reserved.
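The regularized inversion step can be illustrated with linear straight-ray travel-time tomography, a deliberate simplification of the paper's nonlinear EM scheme: with ray-path matrix A, travel times t, and cell slownesses s, zeroth-order Tikhonov regularization solves (AᵀA + λ²I)s = Aᵀt. The geometry and data below are invented.

```python
import numpy as np

def tikhonov_tomography(A, t, lam):
    """Zeroth-order Tikhonov-regularized least squares:
    minimize ||A s - t||^2 + lam^2 ||s||^2 for cell slownesses s.
    A[i, j] is the length of ray i inside cell j; t are travel times."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ t)
```

The choice of λ (or, in the paper's smoothness-constrained variant, of a roughness operator in place of the identity) controls the trade-off the authors note: too much regularization produces the overly smooth tomograms observed in the field results.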
Is flat fielding safe for precision CCD astronomy?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baumer, Michael; Davis, Christopher P.; Roodman, Aaron
2017-07-06
The ambitious goals of precision cosmology with wide-field optical surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST) demand precision CCD astronomy as their foundation. This in turn requires an understanding of previously uncharacterized sources of systematic error in CCD sensors, many of which manifest themselves as static effective variations in pixel area. Such variation renders a critical assumption behind the traditional procedure of flat fielding—that a sensor's pixels comprise a uniform grid—invalid. In this work, we present a method to infer a curl-free model of a sensor's underlying pixel grid from flat-field images, incorporating the superposition of all electrostatic sensor effects—both known and unknown—present in flat-field data. We use these pixel grid models to estimate the overall impact of sensor systematics on photometry, astrometry, and PSF shape measurements in a representative sensor from the Dark Energy Camera (DECam) and a prototype LSST sensor. Applying the method to DECam data recovers known significant sensor effects for which corrections are currently being developed within DES. For an LSST prototype CCD with pixel-response non-uniformity (PRNU) of 0.4%, we find the impact of "improper" flat fielding on these observables is negligible in nominal 0.7'' seeing conditions. Furthermore, these errors scale linearly with the PRNU, so for future LSST production sensors, which may have larger PRNU, our method provides a way to assess whether pixel-level calibration beyond flat fielding will be required.
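The "traditional procedure" the abstract questions is simple division by a normalized flat, which is valid only when flat-field variations reflect per-pixel sensitivity rather than pixel area. A minimal sketch of that step, with made-up gain values mimicking a 0.4% PRNU pattern:

```python
import numpy as np

def flat_field_correct(raw, flat):
    """Traditional flat fielding: divide the science frame by the flat
    normalized to unit mean. Assumes the sensor is a uniform pixel grid,
    so flat variations are pure sensitivity (QE/gain) differences."""
    return raw / (flat / flat.mean())
```

When a pixel's excess flat signal comes from a larger effective area instead of higher sensitivity, this division wrongly rescales point-source fluxes and distorts astrometry and PSF shapes, which is why the paper models the pixel grid itself.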
Survey Field Methods for Expanded Biospecimen and Biomeasure Collection in NSHAP Wave 2
Jaszczak, Angela; Hoffmann, Joscelyn N.; You, Hannah M.; Kern, David W.; Pagel, Kristina; McPhillips, Jane; Schumm, L. Philip; Dale, William; Huang, Elbert S.; McClintock, Martha K.
2014-01-01
Objectives. The National Social Life, Health, and Aging Project is a nationally representative, longitudinal survey of older adults. A main component is the collection of biomeasures to objectively assess physiological status relevant to psychosocial variables, aging conditions, and disease. Wave 2 added novel biomeasures, refined those collected in Wave 1, and provides a reference for the collection protocols and strategy common to the biomeasures. The effects of aging, gender, and their interaction are presented in the specific biomeasure papers included in this Special Issue. Method. A transdisciplinary working group expanded the biomeasures collected to include physiological, genetic, anthropometric, functional, neuropsychological, and sensory measures, yielding 37 more than in Wave 1. All were designed for collection in respondents’ homes by nonmedically trained field interviewers. Results. Both repeated and novel biomeasures were successful. Those in Wave 1 were refined to improve quality, and ensure consistency for longitudinal analysis. Four new biospecimens yielded 27 novel measures. During the interview, 19 biomeasures were recorded covering anthropometric, functional, neuropsychological, and sensory measures and actigraphy provided data on activity and sleep. Discussion. Improved field methods included in-home collection, temperature control, establishment of a central survey biomeasure laboratory, and shipping, all of which were crucial for successful collection by the field interviewers and accurate laboratory assay of the biomeasures (92.1% average co-operation rate and 97.3% average assay success rate). Developed for home interviews, these biomeasures are readily applicable to other surveys. PMID:25360025
DNA-based methods of geochemical prospecting
Ashby, Matthew [Mill Valley, CA
2011-12-06
The present invention relates to methods for performing surveys of the genetic diversity of a population. The invention also relates to methods for performing genetic analyses of a population. The invention further relates to methods for the creation of databases comprising the survey information and the databases created by these methods. The invention also relates to methods for analyzing the information to correlate the presence of nucleic acid markers with desired parameters in a sample. These methods have application in the fields of geochemical exploration, agriculture, bioremediation, environmental analysis, clinical microbiology, forensic science and medicine.
A survey of infrared and visual image fusion methods
NASA Astrophysics Data System (ADS)
Jin, Xin; Jiang, Qian; Yao, Shaowen; Zhou, Dongming; Nie, Rencan; Hai, Jinjin; He, Kangjian
2017-09-01
Infrared (IR) and visual (VI) image fusion is designed to fuse multiple source images into a comprehensive image to boost imaging quality and reduce redundant information, and it is widely used in various imaging equipment to improve the visual ability of humans and robots. The accurate, reliable and complementary descriptions of the scene in fused images make these techniques widely used in various fields. In recent years, a large number of fusion methods for IR and VI images have been proposed owing to ever-growing demands and progress in image representation methods; however, no integrated survey of this field has been published in the last several years. Therefore, we make a survey to report the algorithmic developments of IR and VI image fusion. In this paper, we first characterize applications based on IR and VI image fusion to present an overview of the research status. Then we present a synthesized survey of the state of the art. Thirdly, the frequently-used image fusion quality measures are introduced. Fourthly, we perform experiments on typical methods and make the corresponding analysis. At last, we summarize the corresponding tendencies and challenges in IR and VI image fusion. This survey concludes that although various IR and VI image fusion methods have been proposed, there still exist further improvements or potential research directions in different applications of IR and VI image fusion.
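A baseline fusion rule and one of the frequently-used no-reference quality measures (histogram entropy) can be sketched as follows. Weighted averaging is only the simplest of the method families such surveys cover; the functions below are an illustrative sketch, not any specific surveyed algorithm.

```python
import numpy as np

def fuse_average(ir, vi, w=0.5):
    """Weighted-average pixel-level fusion of co-registered IR and VI images."""
    return w * ir.astype(float) + (1.0 - w) * vi.astype(float)

def image_entropy(img, bins=256):
    """Shannon entropy (bits) of an 8-bit image's intensity histogram;
    higher entropy is commonly read as richer fused information."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())
```

Comparing `image_entropy` of the fused result against the two inputs is one of the standard checks; most serious evaluations combine several such measures, since averaging tends to lower contrast even while it suppresses noise.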
The methods and instructions for field operations presented in this manual for surveys of non-wadeable streams and rivers were developed and tested based on 55 sample sites in the Mid-Atlantic region and 53 sites in an Oregon study during two years of pilot and demonstration proj...
A Low-Cost Energy-Efficient Cableless Geophone Unit for Passive Surface Wave Surveys
Dai, Kaoshan; Li, Xiaofeng; Lu, Chuan; You, Qingyu; Huang, Zhenhua; Wu, H. Felix
2015-01-01
The passive surface wave survey is a practical, non-invasive seismic exploration method that has increasingly been used in geotechnical engineering. However, in situ deployment of traditional wired geophones is labor intensive for a dense sensor array. Alternatively, stand-alone seismometers can be used, but they are bulky, heavy, and expensive because they are usually designed for long-term monitoring. To better facilitate field applications of the passive surface wave survey, a low-cost energy-efficient geophone system was developed in this study. The hardware design is presented in this paper. To validate the system’s functionality, both laboratory and field experiments were conducted. The unique feature of this newly-developed cableless geophone system allows for rapid field applications of the passive surface wave survey with dense array measurements. PMID:26404270
Surveying immigrants without sampling frames - evaluating the success of alternative field methods.
Reichel, David; Morales, Laura
2017-01-01
This paper evaluates the sampling methods of an international survey, the Immigrant Citizens Survey, which aimed at surveying immigrants from outside the European Union (EU) in 15 cities in seven EU countries. In five countries, no sample frame was available for the target population. Consequently, alternative ways to obtain a representative sample had to be found. In three countries 'location sampling' was employed, while in two countries traditional methods were used with adaptations to reach the target population. The paper assesses the main methodological challenges of carrying out a survey among a group of immigrants for whom no sampling frame exists. The samples of the survey in these five countries are compared to results of official statistics in order to assess the accuracy of the samples obtained through the different sampling methods. It can be shown that alternative sampling methods can provide meaningful results in terms of core demographic characteristics although some estimates differ to some extent from the census results.
USDA-ARS?s Scientific Manuscript database
Structure-from-motion (SfM) photogrammetry from unmanned aircraft system (UAS) imagery is an emerging tool for repeat topographic surveying of dryland erosion. These methods are particularly appealing due to the ability to cover large landscapes compared to field methods and at reduced costs and hig...
ERIC Educational Resources Information Center
Wells, Elizabeth Chase
2009-01-01
This research study examined Michigan State University Extension educators' perceptions of the use of digital technology in their work. It used a mixed method of research which included a mailed survey and interviews of selected respondents. A census survey using Dillman's Total Design method was sent to 290 field staff of Michigan State…
Evaluation of selected methods for determining streamflow during periods of ice effect
Melcher, Norwood B.; Walker, J.F.
1992-01-01
Seventeen methods for estimating ice-affected streamflow are evaluated for potential use with the U.S. Geological Survey streamflow-gaging station network. The methods evaluated were identified by written responses from U.S. Geological Survey field offices and by a comprehensive literature search. The methods selected and techniques used for applying the methods are described in this report. The methods are evaluated by comparing estimated results with data collected at three streamflow-gaging stations in Iowa during the winter of 1987-88. Discharge measurements were obtained at 1- to 5-day intervals during the ice-affected periods at the three stations to define an accurate baseline record. Discharge records were compiled for each method based on data available, assuming a 6-week field schedule. The methods are classified into two general categories, subjective and analytical, depending on whether individual judgment is necessary for method application. On the basis of results of the evaluation for the three Iowa stations, two of the subjective methods (discharge ratio and hydrographic-and-climatic comparison) were more accurate than the other subjective methods and approximately as accurate as the best analytical method. Three of the analytical methods (index velocity, adjusted rating curve, and uniform flow) could potentially be used at streamflow-gaging stations, where the need for accurate ice-affected discharge estimates justifies the expense of collecting additional field data. One analytical method (ice-adjustment factor) may be appropriate for use at stations with extremely stable stage-discharge ratings and measuring sections. Further research is needed to refine the analytical methods. The discharge-ratio and multiple-regression methods produce estimates of streamflow for varying ice conditions using information obtained from the existing U.S. Geological Survey streamflow-gaging network.
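The discharge-ratio approach named above can be sketched roughly as follows. The measurement values, dates, and the linear interpolation of the ratio between field visits are illustrative assumptions for this sketch, not the Survey's exact procedure.

```python
def discharge_ratio_estimate(open_water_q, measurements, target_day):
    """Estimate ice-affected discharge by a discharge-ratio scheme:
    interpolate the ratio (measured discharge / open-water rating
    discharge) between winter field measurements, then scale the
    open-water rating discharge for the target day.

    measurements: list of (day, measured_q, rating_q) tuples."""
    ratios = sorted((day, mq / rq) for day, mq, rq in measurements)
    # Linear interpolation of the ratio between bracketing measurements
    for (d0, r0), (d1, r1) in zip(ratios, ratios[1:]):
        if d0 <= target_day <= d1:
            w = (target_day - d0) / (d1 - d0)
            return open_water_q * (r0 + w * (r1 - r0))
    raise ValueError("target day is outside the measured period")

# Hypothetical record: the open-water rating gives 120 cfs on day 12;
# measurements on days 10 and 15 give ratios of 0.55 and 0.65.
q = discharge_ratio_estimate(120.0, [(10, 66.0, 120.0), (15, 78.0, 120.0)], 12)
```

The interpolated ratio on day 12 is 0.59, so the estimate is 0.59 times the rating discharge.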
Clare, John; McKinney, Shawn T; DePue, John E; Loftin, Cynthia S
2017-10-01
It is common to use multiple field sampling methods when implementing wildlife surveys to compare method efficacy or cost efficiency, integrate distinct pieces of information provided by separate methods, or evaluate method-specific biases and misclassification error. Existing models that combine information from multiple field methods or sampling devices permit rigorous comparison of method-specific detection parameters, enable estimation of additional parameters such as false-positive detection probability, and improve occurrence or abundance estimates, but with the assumption that the separate sampling methods produce detections independently of one another. This assumption is tenuous if methods are paired or deployed in close proximity simultaneously, a common practice that reduces the additional effort required to implement multiple methods and reduces the risk that differences between method-specific detection parameters are confounded by other environmental factors. We develop occupancy and spatial capture-recapture models that permit covariance between the detections produced by different methods, use simulation to compare estimator performance of the new models to models assuming independence, and provide an empirical application based on American marten (Martes americana) surveys using paired remote cameras, hair catches, and snow tracking. Simulation results indicate existing models that assume that methods independently detect organisms produce biased parameter estimates and substantially understate estimate uncertainty when this assumption is violated, while our reformulated models are robust to either methodological independence or covariance. Empirical results suggested that remote cameras and snow tracking had comparable probability of detecting present martens, but that snow tracking also produced false-positive marten detections that could potentially substantially bias distribution estimates if not corrected for. 
Remote cameras detected marten individuals more readily than passive hair catches. Inability to photographically distinguish individual sex did not appear to induce negative bias in camera density estimates; instead, hair catches appeared to produce detection competition between individuals that may have been a source of negative bias. Our model reformulations broaden the range of circumstances in which analyses incorporating multiple sources of information can be robustly used, and our empirical results demonstrate that using multiple field-methods can enhance inferences regarding ecological parameters of interest and improve understanding of how reliably survey methods sample these parameters. © 2017 by the Ecological Society of America.
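A toy simulation (not the authors' occupancy or spatial capture-recapture model) can illustrate why paired methods violate the independence assumption: if both detectors depend on a shared "availability" event, joint detections occur far more often than the product of the marginal detection rates would suggest. All parameter values below are hypothetical.

```python
import random

def simulate_paired_detections(n=100_000, p_avail=0.6, p1=0.7, p2=0.7, seed=0):
    """Simulate n occupied sites surveyed by two paired methods.

    The animal is 'available' at the paired station with probability
    p_avail; given availability, each method detects independently with
    probability p1 or p2.  Sharing the availability event induces
    positive covariance between the two methods' detections."""
    rng = random.Random(seed)
    d1 = d2 = both = 0
    for _ in range(n):
        avail = rng.random() < p_avail
        a = avail and rng.random() < p1
        b = avail and rng.random() < p2
        d1 += a
        d2 += b
        both += a and b
    return d1 / n, d2 / n, both / n

p1_hat, p2_hat, both_hat = simulate_paired_detections()
# A model assuming independence expects P(both) close to p1_hat * p2_hat;
# the shared availability event makes the observed joint rate much larger.
```

With these parameters the true joint detection rate is 0.6 × 0.49 ≈ 0.29, while the independence product of the marginals is only about 0.18, which is the kind of mismatch the reformulated models absorb.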
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, E.S.
1988-01-01
An introduction to geophysical methods used to explore for natural resources and to survey earth's geology is presented in this volume. It is suitable for second- and third-year undergraduate students majoring in geology or engineering, and for professional engineers and earth scientists without formal instruction in geophysics. The author assumes the reader is familiar with geometry, algebra, and trigonometry. Geophysical exploration includes seismic refraction and reflection surveying, electrical resistivity and electromagnetic field surveying, and geophysical well logging. Surveying operations are described in step-by-step procedures and are illustrated by practical examples. Computer-based methods of processing and interpreting data, as well as geographical methods, are introduced.
Researchers and the Rural Poor: Asking Questions in the Third World.
ERIC Educational Resources Information Center
Adams, William M.; Megaw, Charles C.
1997-01-01
Discusses the theory and practice of rural socioeconomic surveys in developing nations. Highlights the close links between choice of research topic, field area and research methods, and the ethics of field research. Offers a personal commentary on some practical problems concerning field research. (MJP)
NASA Astrophysics Data System (ADS)
Pratt-Sitaula, B. A.; Shervais, K.; Crosby, C. J.; Douglas, B. J.; Niemi, N. A.; Wang, G.; Charlevoix, D. J.
2015-12-01
Fieldwork is an integral part of the geosciences, and there is a longstanding tradition of teaching field methods as part of the undergraduate curriculum. As new technology changes the ways in which we scientifically examine the Earth, and as workforce development demands evolve, there is growing interest in introducing these new technologies into field education courses. In collaboration with field education instructors, UNAVCO, the National Science Foundation's geodetic facility, has developed a module of teaching resources to integrate terrestrial lidar scanning (TLS) into field courses. An NSF facility is well positioned to develop scalable resources that can then be distributed or adapted for broader implementation. The modules can also be completed using Structure from Motion methods in place of lidar scanning. Module goals are for students to be able to: (A) design and conduct a complex TLS survey to address a geologic research question and (B) articulate the societal impetus for answering these research questions and identify why TLS is the appropriate method in some circumstances. The module comprises five units: (1) Introduction to survey design, (2) Stratigraphic section analysis, (3) Fault scarp analysis, (4) Geomorphic change detection, and (5) Student-led survey design summative assessment. The units, apart from the Introduction, are independent, so selected units can be employed in a given field setting. Prototype module materials were developed from the last five years of UNAVCO support of undergraduate field courses. The current versions of the modules were tested in summer 2015 at the Indiana University and University of Michigan field camps. Results show that the majority of students are able to achieve the intended learning goals. Module materials are available on the UNAVCO Education and Community Engagement website.
Inversion of time-domain induced polarization data based on time-lapse concept
NASA Astrophysics Data System (ADS)
Kim, Bitnarae; Nam, Myung Jin; Kim, Hee Joon
2018-05-01
Induced polarization (IP) surveys, measuring overvoltage phenomena of the medium, are widely and increasingly performed not only for exploration of mineral resources but also for engineering applications. Among the several IP survey methods, such as time-domain, frequency-domain and spectral IP surveys, this study introduces a novel inversion method for time-domain IP data to recover the chargeability structure of the target medium. The inversion method employs the concept of 4D inversion of time-lapse resistivity data sets, exploiting the fact that the measured voltage in a time-domain IP survey is distorted by IP effects, increasing from the instantaneous voltage measured at the moment the source current injection starts. Even though the increase saturates very quickly, the saturated and instantaneous voltages can be treated as a time-lapse data set. The 4D inversion method is one of the most powerful methods for inverting time-lapse resistivity data sets. Using the developed IP inversion algorithm, we invert not only synthetic but also field IP data to show the effectiveness of the proposed method, comparing the recovered chargeability models with those from the linear inversion that was used for the field data in a previous study. Numerical results confirm that the proposed inversion method generates reliable chargeability models even when the anomalous bodies have large IP effects.
Valuing Eastern Visibility: A Field Test of the Contingent Valuation Method (1993)
The report describes the Eastern visibility survey design in detail, presents the implementation of and data obtained from the surveys, provides summary statistics on the overall response and discusses the econometric techniques employed to value benefits.
Máthé, Koppány; Buşoniu, Lucian
2015-01-01
Unmanned aerial vehicles (UAVs) have gained significant attention in recent years. Low-cost platforms using inexpensive sensor payloads have been shown to provide satisfactory flight and navigation capabilities. In this report, we survey vision and control methods that can be applied to low-cost UAVs, and we list some popular inexpensive platforms and application fields where they are useful. We also highlight the sensor suites used where this information is available. We overview, among others, feature detection and tracking, optical flow and visual servoing, low-level stabilization and high-level planning methods. We then list popular low-cost UAVs, selecting mainly quadrotors. We discuss applications, restricting our focus to the field of infrastructure inspection. Finally, as an example, we formulate two use-cases for railway inspection, a less explored application field, and illustrate the usage of the vision and control techniques reviewed by selecting appropriate ones to tackle these use-cases. To select vision methods, we run a thorough set of experimental evaluations. PMID:26121608
A comparison of weed communities of coastal rice fields in Peninsular Malaysia.
Hakim, M A; Juraimi, Abdul Shukor; Hanafi, M M; Selamat, A
2013-09-01
A survey was conducted at 100 different rice fields in coastal areas of West Malaysia to identify the most common and prevalent weeds associated with rice. Fields were surveyed according to the quantitative survey method, using a 0.5 m x 0.5 m quadrat with 20 samples from each field. A total of 53 different weed species belonging to 18 families were identified, of which 32 were annual and 21 perennial; 12 were grasses, 13 sedges and 28 broadleaved weeds. Based on relative abundance, the most prevalent and abundant weed species in the coastal rice fields were selected. Among the 10 most abundant weed species there were four grasses, viz. Echinochloa crus-galli, Leptochloa chinensis, Echinochloa colona and Oryza sativa L. (weedy rice); four sedges, viz. Fimbristylis miliacea, Cyperus iria, Cyperus difformis and Scirpus grossus; and two broadleaved weeds, viz. Sphenoclea zeylanica and Jussiaea linifolia. Leptochloa chinensis, E. crus-galli, F. miliacea and E. colona were the most prevalent and abundant of the 10 dominant weed species in the coastal rice fields of Peninsular Malaysia.
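A relative-abundance ranking of the kind used to select the dominant species can be sketched from quadrat counts. The species counts below are hypothetical, and this simplified index uses raw counts only, whereas published relative-abundance indices often also weight by frequency across fields.

```python
def relative_abundance(quadrat_counts):
    """Relative abundance of each species, as a percentage of all
    individuals counted across every quadrat.

    quadrat_counts: dict mapping species name -> per-quadrat counts."""
    totals = {sp: sum(counts) for sp, counts in quadrat_counts.items()}
    grand_total = sum(totals.values())
    return {sp: 100.0 * t / grand_total for sp, t in totals.items()}

# Hypothetical counts from four 0.5 m x 0.5 m quadrats:
counts = {
    "Echinochloa crus-galli": [12, 8, 15, 10],   # 45 individuals
    "Fimbristylis miliacea": [5, 7, 6, 4],       # 22 individuals
    "Sphenoclea zeylanica": [2, 1, 0, 3],        # 6 individuals
}
ra = relative_abundance(counts)
```

Species are then ranked by their percentage share to pick the 10 most abundant.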
An, Zhao; Wen-Xin, Zhang; Zhong, Yao; Yu-Kuan, Ma; Qing, Liu; Hou-Lang, Duan; Yi-di, Shang
2016-06-29
To optimize and simplify the survey method for Oncomelania hupensis snails in marshland endemic regions of schistosomiasis, and to increase the precision, efficiency and economy of the snail survey, a 50 m × 50 m quadrat experimental field was selected in the Chayegang marshland near Henghu farm in the Poyang Lake region, and a whole-covered method was adopted to survey the snails. Simple random sampling, systematic sampling and stratified random sampling methods were applied to calculate the minimum sample size, relative sampling error and absolute sampling error. The minimum sample sizes of the simple random, systematic and stratified random sampling methods were 300, 300 and 225, respectively. The relative sampling errors of the three methods were all less than 15%. The absolute sampling errors were 0.2217, 0.3024 and 0.0478, respectively. Spatial stratified sampling with altitude as the stratum variable is an efficient approach, with lower cost and higher precision, for the snail survey.
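The minimum sample size for simple random sampling of a proportion can be sketched with the standard formula; the 30% occurrence rate below is hypothetical, since the study's actual design parameters are not given in the abstract.

```python
import math

def srs_min_sample_size(p, rel_error, z=1.96):
    """Minimum simple-random-sample size for estimating a proportion p
    with relative error rel_error at ~95% confidence, from the usual
    formula n = z^2 * p * (1 - p) / d^2 with absolute error d = rel_error * p,
    which simplifies to n = z^2 * (1 - p) / (rel_error^2 * p)."""
    return math.ceil(z ** 2 * (1 - p) / (rel_error ** 2 * p))

# Hypothetical 30% snail-occurrence rate with the 15% relative-error
# tolerance quoted in the abstract:
n = srs_min_sample_size(0.30, 0.15)
```

Stratified designs need fewer samples than this when the stratum variable (here, altitude) explains much of the variance, which is why the stratified minimum sample size in the study is the smallest of the three.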
The Illinois State Water Survey hosted a three-week field intercomparison of several sulfate dry deposition measurement techniques during September 81. The site was an 80-acre grass field in a rural area 14 km southwest of Champaign, IL. The vegetation consisted of mixed grasses ...
[Investigation on field feces in schistosomiasis endemic areas in Jingzhou City].
Tian, Ke-qing; Wang, Jia-song; He, Liang-cai; Peng, You-xin
2014-04-01
To understand the status of field feces in Jingzhou City, so as to provide evidence for improving the control measures that interrupt the transmission routes of schistosomiasis, the distribution of field feces was investigated in 27 schistosomiasis-endemic villages in Gong'an, Jianli, Jiangling, Honghu and Shishou counties (cities) from 2010 to 2012. The schistosome-positive status of the field feces was surveyed with the hatching method. There were 1366 field feces in this survey, at an average density of 0.0892 feces per 100 square meters. Cattle feces, human feces, dog feces and elk feces accounted for 99.71%, 0.07%, 0.15% and 0.07%, respectively. The infection rates of the field feces were 1.46% and 2.42% in the channels and bottomlands, respectively (P > 0.05). The average rate of infected field feces was 3.21% in 2010, 0.36% in 2011, and 1.60% in 2012; the difference between 2010 and 2012 was not statistically significant (P > 0.05). The main field feces come from cattle, and infected field feces are mainly distributed in channels and bottomlands. Therefore, the management of cattle and the treatment of field feces should be strengthened.
NASA Technical Reports Server (NTRS)
Pierzga, M. J.
1981-01-01
The experimental verification of an inviscid, incompressible through-flow analysis method is presented. The primary component of this method is an axisymmetric streamline curvature technique which is used to compute the hub-to-tip flow field of a given turbomachine. To analyze the flow field in the blade-to-blade plane of the machine, the potential flow solution of an infinite cascade of airfoils is also computed using a source model technique. To verify the accuracy of such an analysis method an extensive experimental verification investigation was conducted using an axial flow research fan. Detailed surveys of the blade-free regions of the machine along with intra-blade surveys using rotating pressure sensing probes and blade surface static pressure taps provide a one-to-one relationship between measured and predicted data. The results of this investigation indicate the ability of this inviscid analysis method to predict the design flow field of the axial flow fan test rotor to within a few percent of the measured values.
Large-scale fluctuations in the number density of galaxies in independent surveys of deep fields
NASA Astrophysics Data System (ADS)
Shirokov, S. I.; Lovyagin, N. Yu.; Baryshev, Yu. V.; Gorokhov, V. L.
2016-06-01
New arguments supporting the reality of large-scale fluctuations in the density of the visible matter in deep galaxy surveys are presented. A statistical analysis of the radial distributions of galaxies in the COSMOS and HDF-N deep fields is presented. Independent spectral and photometric surveys exist for each field, carried out in different wavelength ranges and using different observing methods. Catalogs of photometric redshifts in the optical (COSMOS-Zphot) and infrared (UltraVISTA) were used for the COSMOS field in the redshift interval 0.1 < z < 3.5, as well as the zCOSMOS (10kZ) spectroscopic survey and the XMM-COSMOS and ALHAMBRA-F4 photometric redshift surveys. The HDFN-Zphot and ALHAMBRA-F5 catalogs of photometric redshifts were used for the HDF-N field. The Pearson correlation coefficient for the fluctuations in the numbers of galaxies obtained for independent surveys of the same deep field reaches R = 0.70 ± 0.16. The presence of this positive correlation supports the reality of fluctuations in the density of visible matter with sizes of up to 1000 Mpc and amplitudes of up to 20% at redshifts z ~ 2. The absence of correlations between the fluctuations in different fields (the correlation coefficient between COSMOS and HDF-N is R = -0.20 ± 0.31) testifies to the independence of structures visible in different directions on the celestial sphere. This also indicates an absence of any influence from universal systematic errors (such as "spectral voids"), which could imitate the detection of correlated structures.
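The Pearson correlation between two surveys' radial galaxy counts, together with the usual large-sample error estimate s_R = (1 - R^2)/sqrt(N - 1), can be sketched as follows; the per-bin counts are hypothetical.

```python
import math

def pearson_with_error(x, y):
    """Pearson correlation coefficient R of two equal-length samples,
    plus the large-sample error estimate s_R = (1 - R^2) / sqrt(N - 1)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    r = sxy / math.sqrt(sxx * syy)
    return r, (1 - r * r) / math.sqrt(n - 1)

# Hypothetical galaxy counts per radial (redshift) bin from two
# independent surveys of the same deep field:
counts_a = [120, 95, 140, 160, 110, 90, 150, 130]
counts_b = [115, 100, 135, 155, 118, 95, 145, 128]
r, sigma_r = pearson_with_error(counts_a, counts_b)
```

A significantly positive R between independent surveys of the same field, as in the abstract's R = 0.70 ± 0.16, argues that the fluctuations are real structure rather than survey-specific systematics.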
Databases for rRNA gene profiling of microbial communities
Ashby, Matthew
2013-07-02
The present invention relates to methods for performing surveys of the genetic diversity of a population. The invention also relates to methods for performing genetic analyses of a population. The invention further relates to methods for the creation of databases comprising the survey information and the databases created by these methods. The invention also relates to methods for analyzing the information to correlate the presence of nucleic acid markers with desired parameters in a sample. These methods have application in the fields of geochemical exploration, agriculture, bioremediation, environmental analysis, clinical microbiology, forensic science and medicine.
Updating categorical soil maps using limited survey data by Bayesian Markov chain cosimulation.
Li, Weidong; Zhang, Chuanrong; Dey, Dipak K; Willig, Michael R
2013-01-01
Updating categorical soil maps is necessary for providing current, higher-quality soil data to agricultural and environmental management but may not require a costly thorough field survey because latest legacy maps may only need limited corrections. This study suggests a Markov chain random field (MCRF) sequential cosimulation (Co-MCSS) method for updating categorical soil maps using limited survey data provided that qualified legacy maps are available. A case study using synthetic data demonstrates that Co-MCSS can appreciably improve simulation accuracy of soil types with both contributions from a legacy map and limited sample data. The method indicates the following characteristics: (1) if a soil type indicates no change in an update survey or it has been reclassified into another type that similarly evinces no change, it will be simply reproduced in the updated map; (2) if a soil type has changes in some places, it will be simulated with uncertainty quantified by occurrence probability maps; (3) if a soil type has no change in an area but evinces changes in other distant areas, it still can be captured in the area with unobvious uncertainty. We concluded that Co-MCSS might be a practical method for updating categorical soil maps with limited survey data.
Kite Aerial Photography (KAP) as a Tool for Field Teaching
ERIC Educational Resources Information Center
Sander, Lasse
2014-01-01
Kite aerial photography (KAP) is proposed as a creative tool for geography field teaching and as a medium to approach the complexity of readily available geodata. The method can be integrated as field experiment, surveying technique or group activity. The acquired aerial images can instantaneously be integrated in geographic information systems…
Trident Technical College 1999 Graduate Follow-Up Report.
ERIC Educational Resources Information Center
Trident Technical Coll., Charleston, SC.
Presents the results of South Carolina's Trident Technical College's (TTC's) 1999 graduate follow-up survey report. Graduates were surveyed and results were obtained for the following items: graduate goals, employment, placement rates, graduates in related fields, when job obtained, job finding methods, job locations, job satisfaction, job…
Recovering the full velocity and density fields from large-scale redshift-distance samples
NASA Technical Reports Server (NTRS)
Bertschinger, Edmund; Dekel, Avishai
1989-01-01
A new method for extracting the large-scale three-dimensional velocity and mass density fields from measurements of the radial peculiar velocities is presented. Galaxies are assumed to trace the velocity field rather than the mass. The key assumption made is that the Lagrangian velocity field has negligible vorticity, as might be expected from perturbations that grew by gravitational instability. By applying the method to cosmological N-body simulations, it is demonstrated that it accurately reconstructs the velocity field. This technique promises a direct determination of the mass density field and the initial conditions for the formation of large-scale structure from galaxy peculiar velocity surveys.
Comparison of Satellite Surveying to Traditional Surveying Methods for the Resources Industry
NASA Astrophysics Data System (ADS)
Osborne, B. P.; Osborne, V. J.; Kruger, M. L.
Modern ground-based survey methods involve detailed survey, which provides three-space co-ordinates for surveyed points to a high level of accuracy. The instruments are operated by surveyors, who process the raw results to create survey location maps for the subject of the survey. Such surveys are conducted for a location or region and referenced to the earth's global co-ordinate system with global positioning system (GPS) positioning; due to this referencing, the survey is only as accurate as the GPS reference system. Satellite remote-sensing surveys utilise satellite imagery that has been processed using commercial geographic information system software. Three-space co-ordinate maps are generated, with an accuracy determined by the datum position accuracy and the optical resolution of the satellite platform. This paper presents a case study which compares topographic surveying undertaken by traditional survey methods with satellite surveying for the same location. The purpose of this study is to assess the viability of satellite remote sensing for surveying in the resources industry. The case study involves a topographic survey of a dune field for a prospective mining project area in Pakistan. This site has been surveyed using modern surveying techniques, and the results are compared to a satellite survey performed on the same area. Analysis of the results from the traditional survey and from the satellite survey involved a comparison of the derived spatial co-ordinates from each method. In addition, comparisons have been made of costs and turnaround time for both methods. The results of this application of remote sensing are of particular interest for surveys in areas with remote and extreme environments, weather extremes, political unrest, or poor travel links, which are commonly associated with mining projects. Such areas frequently suffer language barriers, poor onsite technical support and limited resources.
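The co-ordinate comparison between the two survey methods can be summarized with a root-mean-square difference over matched points; the co-ordinate values below are hypothetical.

```python
import math

def survey_rmse(points_a, points_b):
    """Root-mean-square difference between matched 3-D survey points
    (easting, northing, elevation) from two surveying methods."""
    if len(points_a) != len(points_b):
        raise ValueError("surveys must cover the same points")
    sq = [sum((p - q) ** 2 for p, q in zip(a, b))
          for a, b in zip(points_a, points_b)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical matched points: ground survey vs satellite-derived
# co-ordinates, in metres:
ground = [(100.0, 200.0, 15.2), (150.0, 240.0, 17.8), (180.0, 260.0, 16.1)]
satellite = [(100.4, 199.7, 15.9), (149.6, 240.5, 17.1), (180.2, 259.8, 16.6)]
rmse = survey_rmse(ground, satellite)
```

A single RMSE figure, alongside cost and turnaround time, gives a compact basis for the viability comparison the paper describes.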
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This case history discusses the exploration methods used at the Momotombo Geothermal Field in western Nicaragua, and evaluates their contributions to the development of the geothermal field models. Subsequent reservoir engineering has not been synthesized or evaluated. A geothermal exploration program was started in Nicaragua in 1966 to discover and delineate potential geothermal reservoirs in western Nicaragua. Exploration began at the Momotombo field in 1970 using geological, geochemical, and geophysical methods. A regional study of thermal manifestations was undertaken and the area on the southern flank of Volcan Momotombo was chosen for more detailed investigation. Subsequent exploration by various consultants produced a number of geotechnical reports on the geology, geophysics, and geochemistry of the field, as well as describing production well drilling. Geological investigations at Momotombo included photogeology, field mapping, binocular microscope examination of cuttings, and drillhole correlations. Among the geophysical techniques used to investigate the field sub-structure were: Schlumberger and electromagnetic soundings, dipole mapping and audio-magnetotelluric surveys, gravity and magnetic measurements, frequency domain soundings, self-potential surveys, and subsurface temperature determinations. The geochemical program analyzed the thermal fluids at the surface and in the wells. This report presents the description and results of the exploration methods used during the investigative stages of the Momotombo Geothermal Field. A conceptual model of the geothermal field was drawn from the information available at each exploration phase. The exploration methods have been evaluated with respect to their contributions to the understanding of the field and their utilization in planning further development. Our principal finding is that data developed at each stage were not sufficiently integrated to guide further work at the field, causing inefficient use of resources.
Trident Technical College 1998 Graduate Follow-Up.
ERIC Educational Resources Information Center
Trident Technical Coll., Charleston, SC.
Presents the results of South Carolina's Trident Technical College's (TTC's) 1998 graduate follow-up survey report of 915 TTC graduates. Graduates were surveyed and results were obtained for the following items: graduate goals, employment, placement rates, graduates in related fields, when job were obtained, job finding methods, job locations, job…
Survival analysis, or what to do with upper limits in astronomical surveys
NASA Technical Reports Server (NTRS)
Isobe, Takashi; Feigelson, Eric D.
1986-01-01
A field of applied statistics called survival analysis has been developed over several decades to deal with censored data, which occur in astronomical surveys when objects are too faint to be detected. This paper reviews how these methods can assist in the statistical interpretation of astronomical data.
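A minimal Kaplan-Meier estimator, the survival-analysis workhorse for censored data, can be sketched in a few lines. The flux values are hypothetical, and the note about flipping left-censored upper limits reflects common practice rather than this paper's specific prescription.

```python
def kaplan_meier(values, detected):
    """Kaplan-Meier estimate of S(t) = P(X > t) for right-censored data;
    `detected` flags whether each value is a real measurement (True) or
    only a censoring point (False).  Astronomical upper limits are
    left-censored, so in practice one flips the data (x -> M - x),
    applies this estimator, and flips the result back.

    Returns a list of (event_value, S) steps."""
    data = sorted(zip(values, detected))
    at_risk = len(data)
    s = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for v, d in data[i:] if v == t and d)
        ties = sum(1 for v, _ in data[i:] if v == t)
        if deaths:
            s *= 1 - deaths / at_risk   # standard product-limit update
            steps.append((t, s))
        at_risk -= ties
        i += ties
    return steps

# Hypothetical fluxes; False marks a non-detection (censored value):
fluxes = [1.0, 2.0, 2.0, 3.0, 4.0, 5.0]
flags = [True, True, False, True, False, True]
km = kaplan_meier(fluxes, flags)
```

The censored points never trigger a step themselves but still shrink the at-risk pool, which is exactly how upper limits contribute information in these surveys.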
75 FR 57253 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-20
....S. Census Bureau. Title: 2011 Field Test of the Re-Engineered Survey of Income and Program...-engineered Survey of Income and Program Participation (SIPP). The Census Bureau's SIPP CAPI interview will use an event history calendar (EHC) interviewing method and a 12-month, calendar-year reference period...
76 FR 51939 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-19
....S. Census Bureau. Title: 2012 Survey of Income and Program Participation Event History Calendar... conduct the 2012 Survey of Income and Program Participation Event History Calendar (SIPP-EHC) Field Test... Calendar (EHC) interviewing method and a 12-month, calendar-year reference period in place of the current...
Degnan, James R.; Moore, Richard Bridge; Mack, Thomas J.
2001-01-01
Bedrock-fracture zones near high-yield bedrock wells in southern New Hampshire well fields were located and characterized using seven surface and six borehole geophysical survey methods. Detailed surveys of six sites with various methods provide an opportunity to integrate and compare survey results. Borehole geophysical surveys were conducted at three of the sites to confirm subsurface features. Hydrogeologic settings, including a variety of bedrock and surface geologic materials, were sought to gain an insight into the usefulness of the methods in varied terrains. Results from 15 survey lines, 8 arrays, and 3 boreholes were processed and interpreted from the 6 sites. The surface geophysical methods used provided physical properties of fractured bedrock. Seismic refraction and ground-penetrating radar (GPR) primarily were used to characterize the overburden materials, but in a few cases indicated bedrock-fracture zones. Magnetometer surveys were used to obtain background information about the bedrock to compare with other results, and to search for magnetic lows, which may result from weathered fractured rock. Electromagnetic terrain conductivity surveys (EM) and very-low-frequency electromagnetic surveys (VLF) were used as rapid reconnaissance techniques with the primary purpose of identifying electrical anomalies, indicating potential fracture zones in bedrock. Direct-current (dc) resistivity methods were used to gather detailed subsurface information about fracture depth and orientation. Two-dimensional (2-D) dc-resistivity surveys using dipole-dipole and Schlumberger arrays located and characterized the overburden, bedrock, and bedrock-fracture zones through analysis of data inversions. Azimuthal square array dc-resistivity survey results indicated orientations of conductive steep-dipping bedrock-fracture zones that were located and characterized by previously applied geophysical methods. 
Various available data sets were used for site selection, characterizations, and interpretations. Lineament data, developed as a part of a statewide and regional scale investigation of the bedrock aquifer, were available to identify potential near-vertical fracture zones. Geophysical surveys indicated fracture zones coincident with lineaments at 4 of the sites. Geologic data collected as a part of the regional scale investigation provided outcrop fracture measurements, ductile fabric, and contact information. Dominant fracture trends correspond to the trends of geophysical anomalies at 4 of the sites. Water-well drillers' logs from water supply and environmental data sets also were used where available to characterize sites. Regional overburden information was compiled from stratified-drift aquifer maps and surficial-geological maps.
Khan, Shane M; Bain, Robert E S; Lunze, Karsten; Unalan, Turgay; Beshanski-Pedersen, Bo; Slaymaker, Tom; Johnston, Richard; Hancioglu, Attila
2017-01-01
The Sustainable Development Goals (SDGs) require household survey programmes such as the UNICEF-supported Multiple Indicator Cluster Surveys (MICS) to enhance data collection to cover new indicators. This study aimed to evaluate methods for assessing water quality, water availability, emptying of sanitation facilities, menstrual hygiene management and the acceptability of water quality testing in households, which are key to monitoring SDG targets 6.1 and 6.2 on drinking Water, Sanitation and Hygiene (WASH) and emerging issues. As part of a MICS field test, we interviewed 429 households and 267 women age 15-49 in Stann Creek, Belize in a split-sample experiment. In a concurrent qualitative component, we conducted focus groups with interviewers and cognitive interviews with respondents during and immediately following questionnaire administration in the field to explore their question comprehension and response processes. About 88% of respondents agreed to water quality testing but also desired test results, given the potential implications for their own health. Escherichia coli was present in 36% of drinking water collected at the source, and in 47% of samples consumed in the household. Both questions on water availability necessitated probing by interviewers. About one quarter of households reported emptying of pit latrines and septic tanks, though one-quarter could not provide an answer to the question. Asking questions on menstrual hygiene was acceptable to respondents, but required some clarification and probing. In the context of Belize, this study confirmed the feasibility of collecting information on the availability and quality of drinking water, emptying of sanitation facilities and menstrual hygiene in a multi-purpose household survey, indicating specific areas to improve question formulation and field protocols.
Improvements have been incorporated into the latest round of MICS surveys which will be a major source of national data for monitoring of SDG targets for drinking water, sanitation and hygiene and emerging issues for WASH sector programming.
A Methodological Intercomparison of Topographic and Aerial Photographic Habitat Survey Techniques
NASA Astrophysics Data System (ADS)
Bangen, S. G.; Wheaton, J. M.; Bouwes, N.
2011-12-01
A severe decline in Columbia River salmonid populations and subsequent Federal listing of subpopulations has mandated both the monitoring of populations and evaluation of the status of available habitat. Numerous field and analytical methods exist to assist in the quantification of the abundance and quality of in-stream habitat for salmonids. These methods range from field 'stick and tape' surveys to spatially explicit topographic and aerial photographic surveys from a mix of ground-based and remotely sensed airborne platforms. Although several previous studies have assessed the quality of specific individual survey methods, an intercomparison of competing techniques across a diverse range of habitat conditions (wadeable headwater channels to non-wadeable mainstem channels) has not yet been undertaken. In this study, we seek to enumerate the relative quality (i.e. accuracy, precision, extent) of habitat metrics and inventories derived from an array of ground-based and remotely sensed surveys of varying degrees of sophistication, as well as quantify the effort and cost in conducting the surveys. Over the summer of 2010, seven sample reaches of varying habitat complexity were surveyed in the Lemhi River Basin, Idaho, USA. Complete topographic surveys were attempted at each site using rtkGPS, total station, ground-based LiDAR and traditional airborne LiDAR. Separate high spatial resolution aerial imagery surveys were acquired using a tethered blimp, a drone UAV, and a traditional fixed-wing aircraft. Here we also developed a relatively simplistic methodology for deriving bathymetry from aerial imagery that could be readily employed by instream habitat monitoring programs. The quality of bathymetric maps derived from aerial imagery was compared with rtkGPS topographic data.
The results are helpful for understanding the strengths and weaknesses of different approaches in specific conditions, and how a hybrid of data acquisition methods can be used to build a more complete quantification of salmonid habitat conditions in streams.
NASA Technical Reports Server (NTRS)
Campbell, M. E.
1972-01-01
A survey is presented of the most recent developments and trends in the field of solid lubrication. Topics discussed include: a history of solid lubrication, lubricating solids, bonded lubricants, new developments, methods of evaluation, environmental effects, application methods, novel materials, and designs for the use of solid lubricants. Excerpts of solid lubricant specifications and a discussion of contact stresses imposed on specimens in three types of test machines used for the evaluation of solid lubricants are presented.
Bain, Robert E. S.; Lunze, Karsten; Unalan, Turgay; Beshanski-Pedersen, Bo; Slaymaker, Tom; Johnston, Richard; Hancioglu, Attila
2017-01-01
Background The Sustainable Development Goals (SDGs) require household survey programmes such as the UNICEF-supported Multiple Indicator Cluster Surveys (MICS) to enhance data collection to cover new indicators. This study aimed to evaluate methods for assessing water quality, water availability, emptying of sanitation facilities, menstrual hygiene management and the acceptability of water quality testing in households, which are key to monitoring SDG targets 6.1 and 6.2 on drinking Water, Sanitation and Hygiene (WASH) and emerging issues. Methods As part of a MICS field test, we interviewed 429 households and 267 women age 15–49 in Stann Creek, Belize in a split-sample experiment. In a concurrent qualitative component, we conducted focus groups with interviewers and cognitive interviews with respondents during and immediately following questionnaire administration in the field to explore their question comprehension and response processes. Findings About 88% of respondents agreed to water quality testing but also desired test results, given the potential implications for their own health. Escherichia coli was present in 36% of drinking water collected at the source, and in 47% of samples consumed in the household. Both questions on water availability necessitated probing by interviewers. About one quarter of households reported emptying of pit latrines and septic tanks, though one-quarter could not provide an answer to the question. Asking questions on menstrual hygiene was acceptable to respondents, but required some clarification and probing. Conclusions In the context of Belize, this study confirmed the feasibility of collecting information on the availability and quality of drinking water, emptying of sanitation facilities and menstrual hygiene in a multi-purpose household survey, indicating specific areas to improve question formulation and field protocols.
Improvements have been incorporated into the latest round of MICS surveys which will be a major source of national data for monitoring of SDG targets for drinking water, sanitation and hygiene and emerging issues for WASH sector programming. PMID:29216244
Magnetic space-based field measurements
NASA Technical Reports Server (NTRS)
Langel, R. A.
1981-01-01
Satellite measurements of the geomagnetic field began with the launch of Sputnik 3 in May 1958 and have continued sporadically in the intervening years. A list of spacecraft that have made significant contributions to an understanding of the near-earth geomagnetic field is presented. A new era in near-earth magnetic field measurements began with NASA's launch of Magsat in October 1979. Attention is given to geomagnetic field modeling, crustal magnetic anomaly studies, and investigations of the inner earth. It is concluded that satellite-based magnetic field measurements make global surveys practical for both field modeling and for the mapping of large-scale crustal anomalies. They are the only practical method of accurately modeling the global secular variation. Magsat is providing a significant contribution, both because of the timeliness of the survey and because its vector measurement capability represents an advance in the technology of such measurements.
Methods of instruction of the incident command system and related topics at US veterinary schools.
Smith, Joe S; Kuldau, Gretchen A
2014-12-01
The Incident Command System (ICS) is an adaptable construct designed to streamline response efforts to a disaster or other incident. We aimed to examine the methods used to teach the ICS at US veterinary schools and to explore alternative and novel methods for instruction of this material. A total of 29 US accredited veterinary schools (as of February 2012) were surveyed, and 18 of the 29 schools responded. The ICS and related topics were taught by both classroom methods and online instruction by most of the surveyed schools. Several of the schools used readily available Federal Emergency Management Agency and US Department of Agriculture resources to aid in instruction. Most schools used one course to teach the ICS, and some schools also used unique methods such as field exercises, drills, side-by-side training with disaster response teams, elective courses, extracurricular clubs, and externships to reinforce the ICS and related topics. Some of the surveyed institutions also utilized fourth-year clinical rotations and field deployments during actual disasters as a component of their ICS and emergency response curriculum. The ICS is being taught in some form at a significant number of US veterinary schools. Additional research is needed to evaluate the efficacy of the teaching methods of the ICS in US veterinary schools.
NASA Astrophysics Data System (ADS)
Kiflu, H.; Kruse, S.; Loke, M. H.; Wilkinson, P. B.; Harro, D.
2016-12-01
Electrical resistivity tomography (ERT) surveys are widely used in geological, environmental and engineering studies. However, the effectiveness of surface ERT surveys is limited by decreasing resolution with depth and near the ends of the survey line. Increasing the array length will increase depth of investigation, but may not be possible at urban sites where access is limited. One novel method of addressing these limitations while maintaining lateral coverage is to install an array of deep electrodes. Referred to here as the Multi-Electrode Resistivity Implant Technique (MERIT), self-driving pointed electrodes are implanted at depth below each surface electrode in an array, using direct-push technology. Optimal sequences of readings have been identified with the "Compare R" method of Wilkinson. Numerical, laboratory, and field case studies are applied to examine the effectiveness of the MERIT method, particularly for use in covered karst terrain. In the field case studies, resistivity images are compared against subsurface structure defined from borings, GPR surveys, and knowledge of prior land use. In karst terrain where limestone has a clay overburden, traditional surface resistivity methods suffer from lack of current penetration through the shallow clay layer. In these settings, the MERIT method is found to improve resolution of features between the surface and buried array, as well as increasing depth of penetration and enhancing imaging capabilities at the array ends. The method functions similarly to a cross-borehole array between horizontal boreholes, and suffers from limitations common to borehole arrays. Inversion artifacts are common at depths close to the buried array, and because some readings involve high geometric factors, inversions are more susceptible to noise than traditional surface arrays. Results are improved by using errors from reciprocal measurements to weight the data during the inversion.
Radio Galaxy Zoo: Machine learning for radio source host galaxy cross-identification
NASA Astrophysics Data System (ADS)
Alger, M. J.; Banfield, J. K.; Ong, C. S.; Rudnick, L.; Wong, O. I.; Wolf, C.; Andernach, H.; Norris, R. P.; Shabala, S. S.
2018-05-01
We consider the problem of determining the host galaxies of radio sources by cross-identification. This has traditionally been done manually, which will be intractable for wide-area radio surveys like the Evolutionary Map of the Universe (EMU). Automated cross-identification will be critical for these future surveys, and machine learning may provide the tools to develop such methods. We apply a standard approach from computer vision to cross-identification, introducing one possible way of automating this problem, and explore the pros and cons of this approach. We apply our method to the 1.4 GHz Australian Telescope Large Area Survey (ATLAS) observations of the Chandra Deep Field South (CDFS) and the ESO Large Area ISO Survey South 1 (ELAIS-S1) fields by cross-identifying them with the Spitzer Wide-area Infrared Extragalactic (SWIRE) survey. We train our method with two sets of data: expert cross-identifications of CDFS from the initial ATLAS data release and crowdsourced cross-identifications of CDFS from Radio Galaxy Zoo. We found that a simple strategy of cross-identifying a radio component with the nearest galaxy performs comparably to our more complex methods, though our estimated best-case performance is near 100 per cent. ATLAS contains 87 complex radio sources that have been cross-identified by experts, so there are not enough complex examples to learn how to cross-identify them accurately. Much larger datasets are therefore required for training methods like ours. We also show that training our method on Radio Galaxy Zoo cross-identifications gives comparable results to training on expert cross-identifications, demonstrating the value of crowdsourced training data.
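The nearest-galaxy baseline that the authors find surprisingly competitive can be sketched in a few lines. The coordinate handling (a small-angle, flat-sky approximation) and the function names are illustrative assumptions, not the paper's code:

```python
import math

def angular_sep(ra1, dec1, ra2, dec2):
    """Approximate angular separation in degrees, valid for small separations.

    Uses a flat-sky approximation with the RA offset scaled by cos(dec).
    """
    dra = (ra1 - ra2) * math.cos(math.radians(0.5 * (dec1 + dec2)))
    ddec = dec1 - dec2
    return math.hypot(dra, ddec)

def nearest_host(radio_pos, hosts):
    """Return the index of the candidate host closest to a radio component.

    radio_pos -- (ra, dec) of the radio component, degrees
    hosts     -- list of (ra, dec) candidate host positions, degrees
    """
    ra, dec = radio_pos
    return min(range(len(hosts)),
               key=lambda i: angular_sep(ra, dec, hosts[i][0], hosts[i][1]))
```

The paper's point is that learned methods must beat this trivial matcher on complex (multi-component, wide-separation) sources to justify their extra machinery.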
Cluster candidates around low-power radio galaxies at z ∼ 1-2 in cosmos
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castignani, G.; Celotti, A.; De Zotti, G.
2014-09-10
We search for high-redshift (z ∼1-2) galaxy clusters using low power radio galaxies (FR I) as beacons and our newly developed Poisson probability method based on photometric redshift information and galaxy number counts. We use a sample of 32 FR Is within the Cosmic Evolution Survey (COSMOS) field from the Chiaberge et al. catalog. We derive a reliable subsample of 21 bona fide low luminosity radio galaxies (LLRGs) and a subsample of 11 high luminosity radio galaxies (HLRGs), on the basis of photometric redshift information and NRAO VLA Sky Survey radio fluxes. The LLRGs are selected to have 1.4 GHz rest-frame luminosities lower than the fiducial FR I/FR II divide. This also allows us to estimate the comoving space density of sources with L_1.4 ≅ 10^32.3 erg s^−1 Hz^−1 at z ≅ 1.1, which strengthens the case for a strong cosmological evolution of these sources. In the fields of the LLRGs and HLRGs we find evidence that 14 and 8 of them reside in rich groups or galaxy clusters, respectively. Thus, overdensities are found around ∼70% of the FR Is, independently of the considered subsample. This rate is in agreement with the fraction found for low redshift FR Is and it is significantly higher than that for FR IIs at all redshifts. Although our method is primarily introduced for the COSMOS survey, it may be applied to both present and future wide field surveys such as Sloan Digital Sky Survey Stripe 82, LSST, and Euclid. Furthermore, cluster candidates found with our method are excellent targets for next generation space telescopes such as James Webb Space Telescope.
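The kind of Poisson probability test described above can be sketched as follows: count galaxies (e.g. in a photometric-redshift slice) within an aperture around the radio beacon, then compute how unlikely that count is given the mean background density. The function names and the direct summation of the Poisson tail are illustrative, not the authors' implementation:

```python
import math

def poisson_sf(n, mu):
    """Survival function P(X >= n) for X ~ Poisson(mu).

    Computed as 1 - CDF(n-1) by direct summation; adequate for the
    modest counts involved in an aperture overdensity test.
    """
    cdf = sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n))
    return 1.0 - cdf

def overdensity_pvalue(n_obs, bg_density, aperture_area):
    """p-value that n_obs galaxies in the aperture is a background fluctuation.

    bg_density    -- mean background counts per unit area (same units as area)
    aperture_area -- area of the search aperture around the beacon
    """
    mu = bg_density * aperture_area
    return poisson_sf(n_obs, mu)
```

A small p-value flags the field as a cluster candidate; the real method additionally slices in photometric redshift so that foreground and background galaxies do not dilute the signal.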
Bathymetric surveying with GPS and heave, pitch, and roll compensation
Work, P.A.; Hansen, M.; Rogers, W.E.
1998-01-01
Field and laboratory tests of a shipborne hydrographic survey system were conducted. The system consists of two 12-channel GPS receivers (one on-board, one fixed on shore), a digital acoustic fathometer, and a digital heave-pitch-roll (HPR) recorder. Laboratory tests of the HPR recorder and fathometer are documented. Results of field tests of the isolated GPS system and then of the entire suite of instruments are presented. A method for data reduction is developed to account for vertical errors introduced by roll and pitch of the survey vessel, which can be substantial (decimeters). The GPS vertical position data are found to be reliable to 2-3 cm and the fathometer to 5 cm in the laboratory. The field test of the complete system in shallow water (<2 m) indicates absolute vertical accuracy of 10-20 cm. Much of this error is attributed to the fathometer. Careful surveying and equipment setup can minimize systematic error and yield much smaller average errors.
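The attitude correction described above can be illustrated with simplified geometry: a sounding measured along the transducer axis shortens by the cosines of pitch and roll, and heave shifts the vertical reference. Signs, lever arms, and mounting conventions vary by installation, so this is a sketch under stated assumptions rather than the paper's actual data-reduction method:

```python
import math

def corrected_depth(slant_depth, pitch_deg, roll_deg, heave):
    """Reduce a raw sounding to vertical depth below the reference level.

    slant_depth -- raw fathometer reading along the beam axis (m)
    pitch_deg   -- vessel pitch at the time of the ping (degrees)
    roll_deg    -- vessel roll at the time of the ping (degrees)
    heave       -- vertical displacement of the transducer (m, up positive)

    A tilted transducer measures a slant range, so the vertical component
    is the reading scaled by cos(pitch)*cos(roll); heave then moves the
    transducer itself relative to the datum.
    """
    vertical = slant_depth * math.cos(math.radians(pitch_deg)) \
                           * math.cos(math.radians(roll_deg))
    return vertical - heave
```

Even a few degrees of combined pitch and roll on a 10 m sounding produces decimeter-level vertical error, which is why the paper finds the HPR compensation worthwhile.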
Modular neural networks: a survey.
Auda, G; Kamel, M
1999-04-01
Modular Neural Networks (MNNs) are a rapidly growing field in artificial Neural Network (NN) research. This paper surveys the different motivations for creating MNNs: biological, psychological, hardware, and computational. Then, the general stages of MNN design are outlined and surveyed as well, viz., task decomposition techniques, learning schemes and multi-module decision-making strategies. Advantages and disadvantages of the surveyed methods are pointed out, and an assessment with respect to practical potential is provided. Finally, some general recommendations for future designs are presented.
Environmental projects. Volume 4: Asbestos survey
NASA Technical Reports Server (NTRS)
Kushner, L.
1988-01-01
The Goldstone Deep Space Communications Complex (GDSCC), near Barstow, California, operates in support of six large parabolic dish antennas. Many of the buildings and structures at the GDSCC were erected before it became known that asbestos posed a hazard to human health. Thus, because of concern with asbestos, two field surveys were conducted at the GDSCC in October/November 1986 and in September 1987 to locate, classify, and quantify all asbestos-containing materials in buildings, structures, roofs and boilers. The report describes the results of the two surveys and describes methods for both asbestos management and asbestos abatement. The surveys found that GDSCC practices involving asbestos are conscientious and forward-thinking. A program, due to start in FY 1988 and to be completed in FY 1990, is planned to remove all friable (easily pulverized) asbestos-containing materials discovered during the two field surveys for asbestos at the GDSCC.
NASA Astrophysics Data System (ADS)
Rufino, Marta M.; Baptista, Paulo; Pereira, Fábio; Gaspar, Miguel B.
2018-01-01
In the current work we propose a new method to sample surface sediment during bivalve fishing surveys. Fishing institutes all around the world carry out regular surveys with the aim of monitoring the stocks of commercial species. These surveys often comprise more than one hundred sampling stations and cover large geographical areas. Although superficial sediment grain sizes are among the main drivers of benthic communities and provide crucial information for studies on coastal dynamics, overall there is a strong lack of this type of data, possibly because traditional surface sediment sampling methods use grabs, which require considerable time and effort to deploy on a regular basis or over large areas. In light of these aspects, we developed an easy and inexpensive method to sample superficial sediments during bivalve fisheries monitoring surveys, without increasing survey time or human resources. The method was successfully evaluated and validated during a typical bivalve survey carried out on the Northwest coast of Portugal, confirming that it did not interfere with the survey objectives. Furthermore, the method was validated by collecting samples using a traditional Van Veen grab (traditional method), which showed a similar grain size composition to the ones collected by the new method at the same localities. We recommend that the procedure be implemented on regular bivalve fishing surveys, together with an image analysis system to analyse the collected samples. The new method will provide a substantial quantity of data on surface sediment in coastal areas in an inexpensive and efficient manner, with a high potential application in different fields of research.
Employing broadband spectra and cluster analysis to assess thermal defoliation of cotton
USDA-ARS?s Scientific Manuscript database
Growers and field scouts need assistance in surveying cotton (Gossypium hirsutum L.) fields subjected to thermal defoliation to reap the benefits provided by this nonchemical defoliation method. A study was conducted to evaluate broadband spectral data and unsupervised classification as tools for s...
Until recently, lake physical habitat assessment has been an underemployed tool for assessing lake and reservoir ecological condition. We outline and evaluate a rapid field sampling and analytical approach for quantifying near-shore physical habitat. We quantified the repeatabil...
NASA Astrophysics Data System (ADS)
Lacki, Brian C.; Kochanek, Christopher S.; Stanek, Krzysztof Z.; Inada, Naohisa; Oguri, Masamune
2009-06-01
Difference imaging provides a new way to discover gravitationally lensed quasars because few nonlensed sources will show spatially extended, time variable flux. We test the method on the fields of lens candidates in the Sloan Digital Sky Survey (SDSS) Supernova Survey region from the SDSS Quasar Lens Search (SQLS) and one serendipitously discovered lensed quasar. Starting from 20,536 sources, including 49 SDSS quasars, 32 candidate lenses/lensed images, and one known lensed quasar, we find that 174 sources including 35 SDSS quasars, 16 candidate lenses/lensed images, and the known lensed quasar are nonperiodic variable sources. We can measure the spatial structure of the variable flux for 119 of these variable sources and identify only eight as candidate extended variables, including the known lensed quasar. Only the known lensed quasar appears as a close pair of sources on the difference images. Inspection of the remaining seven suggests they are false positives, and only two were spectroscopically identified quasars. One of the lens candidates from the SQLS survives our cuts, but only as a single image instead of a pair. This indicates a false positive rate of order ~1/4000 for the method, or given our effective survey area of order 0.82 deg2, ~5 per deg2 in the SDSS Supernova Survey. The fraction of quasars not found to be variable and the false positive rate would both fall if we had analyzed the full, later data releases for the SDSS fields. While application of the method to the SDSS is limited by the resolution, depth, and sampling of the survey, several future surveys such as Pan-STARRS, LSST, and SNAP will significantly improve on these limitations.
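The core idea of difference-imaging selection can be caricatured in a few lines: subtract a reference image from each epoch and flag pixels whose flux varies strongly. Real pipelines PSF-match the images before subtracting and then measure the spatial extent of the variable flux; this toy version, with illustrative names and plain-list images, only thresholds the per-pixel RMS of the differences:

```python
def variable_pixels(epochs, reference, threshold):
    """Flag pixels whose flux varies strongly across epochs.

    epochs    -- list of aligned images (2-D lists of floats)
    reference -- 2-D list of floats (e.g. a deep stacked image)
    threshold -- minimum RMS of the difference flux to count as variable

    Returns a list of (row, col) indices of variable pixels; in the real
    method, spatially extended groups of such pixels are the lens candidates.
    """
    ny, nx = len(reference), len(reference[0])
    flagged = []
    for i in range(ny):
        for j in range(nx):
            diffs = [ep[i][j] - reference[i][j] for ep in epochs]
            mean = sum(diffs) / len(diffs)
            rms = (sum((d - mean) ** 2 for d in diffs) / len(diffs)) ** 0.5
            if rms > threshold:
                flagged.append((i, j))
    return flagged
```

The lensed-quasar signature is then that the flagged variable flux spans multiple resolution elements (a close pair of variable sources), which few non-lensed objects mimic.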
Assessment of soil compaction properties based on surface wave techniques
NASA Astrophysics Data System (ADS)
Jihan Syamimi Jafri, Nur; Rahim, Mohd Asri Ab; Zahid, Mohd Zulham Affandi Mohd; Faizah Bawadi, Nor; Munsif Ahmad, Muhammad; Faizal Mansor, Ahmad; Omar, Wan Mohd Sabki Wan
2018-03-01
Soil compaction plays an important role in every construction activity, reducing the risk of damage. Traditional methods of assessing compaction, including field tests and invasive penetration tests of compacted areas, have great limitations and make evaluating large areas time-consuming. Thus, this study proposed the possibility of using a non-invasive surface wave method, Multi-channel Analysis of Surface Waves (MASW), as a useful tool for assessing soil compaction. The aim of this study was to determine the shear wave velocity profiles and field density of compacted soils under varying compaction efforts by using the MASW method. Pre- and post-compaction MASW surveys were conducted at Pauh Campus, UniMAP after applying rolling compaction with varying numbers of passes (2, 6 and 10). Each seismic dataset was recorded by a GEODE seismograph. A sand replacement test was conducted for each survey line to obtain the field density data. All seismic data were processed using SeisImager/SW software. The results show that the shear wave velocity profiles increase with the number of passes from 0 to 6 passes, but decrease after 10 passes. This method could attract the interest of the geotechnical community, as it can be an alternative tool to the standard test for assessing soil compaction in field operations.
Field evaluation of distance-estimation error during wetland-dependent bird surveys
Nadeau, Christopher P.; Conway, Courtney J.
2012-01-01
Context: The most common methods to estimate detection probability during avian point-count surveys involve recording a distance between the survey point and individual birds detected during the survey period. Accurately measuring or estimating distance is an important assumption of these methods; however, this assumption is rarely tested in the context of aural avian point-count surveys. Aims: We expand on recent bird-simulation studies to document the error associated with estimating distance to calling birds in a wetland ecosystem. Methods: We used two approaches to estimate the error associated with five surveyors' distance estimates between the survey point and calling birds, and to determine the factors that affect a surveyor's ability to estimate distance. Key results: We observed biased and imprecise distance estimates when estimating distance to simulated birds in a point-count scenario (x̄error = -9 m, s.d.error = 47 m) and when estimating distances to real birds during field trials (x̄error = 39 m, s.d.error = 79 m). The amount of bias and precision in distance estimates differed among surveyors; surveyors with more training and experience were less biased and more precise when estimating distance to both real and simulated birds. Three environmental factors were important in explaining the error associated with distance estimates: the measured distance from the bird to the surveyor, the volume of the call and the species of bird. Surveyors tended to make large overestimations to birds close to the survey point, which is an especially serious error in distance sampling. Conclusions: Our results suggest that distance-estimation error is prevalent, but surveyor training may be the easiest way to reduce distance-estimation error.
Implications: The present study has demonstrated how relatively simple field trials can be used to estimate the error associated with distance estimates used to estimate detection probability during avian point-count surveys. Evaluating distance-estimation errors will allow investigators to better evaluate the accuracy of avian density and trend estimates. Moreover, investigators who evaluate distance-estimation errors could employ recently developed models to incorporate distance-estimation error into analyses. We encourage further development of such models, including the inclusion of such models into distance-analysis software.
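The bias and precision statistics quoted above (x̄error, s.d.error) amount to the mean and standard deviation of the (estimated − true) distances from the field trials. A minimal sketch with made-up numbers:

```python
from statistics import mean, stdev

def distance_error_summary(estimated, true):
    """Summarize distance-estimation errors from paired trials.

    estimated -- surveyor's distance estimates (m)
    true      -- measured distances to the same birds (m)

    Returns (bias, precision): the mean error (negative = underestimation)
    and the sample standard deviation of the errors.
    """
    errors = [e - t for e, t in zip(estimated, true)]
    return mean(errors), stdev(errors)
```

Comparing these two numbers per surveyor is how the study separates training effects (lower bias) from inherent variability (lower spread).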
Developing a prediction model for Armillaria solidipes in Arizona
N. B. Klopfenstein; J. W Hanna; M. L. Fairweather; J. D. Shaw; R. Mathiasen; C. Hoffman; E. Nelson; M. -S. Kim; A. L. Ross-Davis
2012-01-01
In 2010, a collaborative project was started to determine the distribution of Armillaria solidipes (= A. ostoyae) in Arizona. The methods and preliminary accomplishments of the 2010 and 2011 (ongoing) field surveys/collections are summarized. During the next phase of this project, surveys will be completed and remaining Armillaria isolates will be identified using DNA-...
Factors Affecting Student Retention at One Independent School in the Southwest
ERIC Educational Resources Information Center
Ahlstrom, Dan Roger
2013-01-01
This mixed-methods case study determined the factors and examined the issues associated with student retention at a faith-based independent day school in southwestern United States of America. The data included online surveys, personal interviews, collection of archival information, and the researcher's extensive field notes. Surveys (530) were…
The SAMI Galaxy Survey: can we trust aperture corrections to predict star formation?
NASA Astrophysics Data System (ADS)
Richards, S. N.; Bryant, J. J.; Croom, S. M.; Hopkins, A. M.; Schaefer, A. L.; Bland-Hawthorn, J.; Allen, J. T.; Brough, S.; Cecil, G.; Cortese, L.; Fogarty, L. M. R.; Gunawardhana, M. L. P.; Goodwin, M.; Green, A. W.; Ho, I.-T.; Kewley, L. J.; Konstantopoulos, I. S.; Lawrence, J. S.; Lorente, N. P. F.; Medling, A. M.; Owers, M. S.; Sharp, R.; Sweet, S. M.; Taylor, E. N.
2016-01-01
In the low-redshift Universe (z < 0.3), our view of galaxy evolution is primarily based on fibre optic spectroscopy surveys. Elaborate methods have been developed to address aperture effects when fixed aperture sizes only probe the inner regions for galaxies of ever decreasing redshift or increasing physical size. These aperture corrections rely on assumptions about the physical properties of galaxies. The adequacy of these aperture corrections can be tested with integral-field spectroscopic data. We use integral-field spectra drawn from 1212 galaxies observed as part of the SAMI Galaxy Survey to investigate the validity of two aperture correction methods that attempt to estimate a galaxy's total instantaneous star formation rate. We show that biases arise when assuming that instantaneous star formation is traced by broad-band imaging, and when the aperture correction is built only from spectra of the nuclear region of galaxies. These biases may be significant depending on the selection criteria of a survey sample. Understanding the sensitivities of these aperture corrections is essential for correct handling of systematic errors in galaxy evolution studies.
Lim, Jacqueline Kyungah; Carabali, Mabel; Lee, Jung-Seok; Lee, Kang-Sung; Namkung, Suk; Lim, Sl-Ki; Ridde, Valéry; Fernandes, Jose; Lell, Bertrand; Matendechero, Sultani Hadley; Esen, Meral; Andia, Esther; Oyembo, Noah; Barro, Ahmed; Bonnet, Emmanuel; Njenga, Sammy M; Agnandji, Selidji Todagbe; Yaro, Seydou; Alexander, Neal; Yoon, In-Kyu
2018-01-01
Introduction Dengue is an important and well-documented public health problem in the Asia-Pacific and Latin American regions. However, in Africa, information on disease burden is limited to case reports and reports of sporadic outbreaks, thus hindering the implementation of public health actions for disease control. To gather evidence on the undocumented burden of dengue in Africa, epidemiological studies with standardised methods were launched in three locations in Africa. Methods and analysis In 2014–2017, the Dengue Vaccine Initiative initiated field studies at three sites in Ouagadougou, Burkina Faso; Lambaréné, Gabon and Mombasa, Kenya to obtain comparable incidence data on dengue and assess its burden through standardised hospital-based surveillance and community-based serological methods. Multidisciplinary measurements of the burden of dengue were obtained through field studies that included passive facility-based fever surveillance, cost-of-illness surveys, serological surveys and healthcare utilisation surveys. All three sites conducted case detection using standardised procedures with uniform laboratory assays to diagnose dengue. Healthcare utilisation surveys were conducted to adjust population denominators in incidence calculations for differing healthcare seeking patterns. The fever surveillance data will allow calculation of age-specific incidence rates and comparison of symptomatic presentation between patients with dengue and non-dengue using multivariable logistic regression. Serological surveys assessed changes in immune status of cohorts of approximately 3000 randomly selected residents at each site at 6-month intervals. The age-stratified serosurvey data will allow calculation of seroprevalence and force of infection of dengue. Cost-of-illness evaluations were conducted among patients with acute dengue by Rapid Diagnostic Test. 
Ethics and dissemination By standardising methods to evaluate dengue burden across several sites in Africa, these studies will generate evidence for the burden of dengue in Africa, and the data will be disseminated through publications in peer-reviewed journals in 2018. PMID:29358421
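The denominator adjustment described above, using healthcare utilisation surveys to correct facility-based incidence for care-seeking patterns, can be sketched in simplified form; the function, rate basis, and numbers below are illustrative assumptions, not the study's actual protocol:

```python
def adjusted_incidence(cases, population, seek_care_fraction, person_years=1.0):
    """Facility-based incidence adjusted for healthcare-seeking behaviour.

    Shrinks the denominator to the fraction of the catchment population
    that would present to a surveillance facility when febrile; returns
    a rate per 1,000 person-years. A simplified sketch only.
    """
    effective_pop = population * seek_care_fraction * person_years
    return 1000.0 * cases / effective_pop

# Illustrative values: 120 detected cases, 300,000 residents,
# 60% of whom would seek care at a surveillance facility
rate = adjusted_incidence(cases=120, population=300000, seek_care_fraction=0.6)
```

Without the adjustment (seek_care_fraction = 1.0) the same case count would yield a lower, biased rate.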
ERIC Educational Resources Information Center
Saunders, Daniel B.; Kolek, Ethan A.; Williams, Elizabeth A.; Wells, Ryan S.
2016-01-01
Previous research has found that the field of higher education, particularly in the United States, is dominated by functionalist approaches, a preponderance of survey data, and the ubiquitous use of advanced quantitative methods to investigate educational phenomena. This descriptive study aims to illuminate why the field is constructed in this way.…
The Relationship of Field of Study to Current Smoking Status among College Students
ERIC Educational Resources Information Center
Berg, Carla J.; Klatt, Colleen M.; Thomas, Janet L.; Ahluwalia, Jasjit S.; An, Lawrence C.
2009-01-01
Problem: No research to date has examined how smoking rates vary across fields of study among college students. Thus, this study aimed to determine whether smoking prevalence varies among students in different fields of study. Method: An online health behavior survey was administered to 25,000 students (n = 6,492; 26% response rate).…
Magnetic profiling of the San Andreas Fault using a dual magnetometer UAV aerial survey system.
NASA Astrophysics Data System (ADS)
Abbate, J. A.; Angelopoulos, V.; Masongsong, E. V.; Yang, J.; Medina, H. R.; Moon, S.; Davis, P. M.
2017-12-01
Aeromagnetic survey methods using planes are more time-effective than hand-held methods, but can be far more expensive per unit area unless large areas are covered. The availability of low-cost UAVs and low-cost, lightweight fluxgate magnetometers (FGMs) allows, with proper offset determination and stray-field correction, for low-cost magnetic surveys. Towards that end, we have developed a custom multicopter UAV for magnetic mapping using a dual 3-axis fluxgate magnetometer system: the GEOphysical Drone Enhanced Survey Instrument (GEODESI). A high-precision sensor measures the UAV's position and attitude (roll, pitch, and yaw), and its output is recorded by a custom Arduino data processing system. The two FGMs (in-board and out-board) are placed at the two ends of a vertical 1 m boom attached to the base of the UAV. The in-board FGM is most sensitive to stray fields from the UAV, and its signal is used, after scaling, to clean the out-board FGM signal of vehicle noise. The FGMs record three orthogonal components of the magnetic field in UAV body coordinates, which are then transformed into a north-east-down coordinate system using a rotation matrix determined from the roll-pitch-yaw attitude data. This ensures knowledge of the direction of all three field components, enabling us to perform inverse modeling of magnetic anomalies with greater accuracy than the total- or vertical-field measurements used in the past. Field tests were performed at Dragon's Back Pressure Ridge in the Carrizo Plain of California, where there is a known crossing of the San Andreas Fault. Our data and models were compared to previously acquired LiDAR and hand-held magnetometer measurements. Further tests will be carried out to solidify our results and streamline our processing for educational use in the classroom and student field training.
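The body-to-NED transformation described above can be sketched with a standard yaw-pitch-roll (Z-Y-X) Euler rotation; this is a generic aerospace convention assumed for illustration, not the GEODESI processing code itself:

```python
import numpy as np

def body_to_ned(v_body, roll, pitch, yaw):
    """Rotate a 3-vector from UAV body axes to north-east-down axes.

    Uses the standard aerospace Z-Y-X (yaw-pitch-roll) Euler sequence;
    all angles in radians.
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw
    return Rz @ Ry @ Rx @ np.asarray(v_body)

# A field component along the body x-axis maps to east after a 90-degree yaw
b = body_to_ned([1.0, 0.0, 0.0], 0.0, 0.0, np.pi / 2)
```

Applying this rotation sample-by-sample to the cleaned out-board FGM record would recover all three geographic field components, as the abstract describes.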
Underwater Acoustic Target Tracking: A Review
Han, Ying; Fan, Liying
2018-01-01
Advances in acoustic technology and instrumentation now make it possible to explore marine resources. As a significant component of ocean exploration, underwater acoustic target tracking has attracted wide attention in both military and civil fields. Due to the complexity of the marine environment, numerous techniques have been proposed to obtain better tracking performance. In this paper, we survey over 100 papers, from early innovative work to the state of the art, to present underwater tracking technologies. We not only clarify the relevant acoustic tracking instruments and research progress in detail, but also propose a novel taxonomy: algorithms for underwater acoustic target tracking are classified by the methods used as (1) instrument-assisted methods; (2) mode-based methods; (3) tracking optimization methods. These algorithms are compared and analyzed with respect to the dimension, number, and maneuvering of the tracking targets, which distinguishes this survey from others. Challenges, countermeasures, and lessons learned are also illustrated. PMID:29301318
Frequency-Wavenumber (FK)-Based Data Selection in High-Frequency Passive Surface Wave Survey
NASA Astrophysics Data System (ADS)
Cheng, Feng; Xia, Jianghai; Xu, Zongbo; Hu, Yue; Mi, Binbin
2018-04-01
Passive surface wave methods have gained much attention from the geophysical and civil engineering communities because of the limited applicability of traditional seismic surveys in highly populated urban areas. Because they can provide high-frequency phase velocity information up to several tens of Hz, the active surface wave survey can be omitted and the amount of field work dramatically reduced. However, the measured dispersion energy image in a passive surface wave survey is usually polluted by a type of "crossed" artifact at high frequencies. This is common in the bidirectional noise distribution case, with a linear receiver array deployed along roads or railways. We review several frequently used passive surface wave methods and derive the underlying physics for the existence of the "crossed" artifacts. We prove that the "crossed" artifacts cross the true surface wave energy at fixed points in the f-v domain and propose an FK-based data selection technique to attenuate the artifacts in order to retrieve the high-frequency information. Numerical tests further demonstrate the existence of the "crossed" artifacts and indicate that the well-known wave field separation method, the FK filter, does not work for the selection of directional noise data. Real-world applications demonstrate the feasibility of the proposed FK-based technique to improve passive surface wave methods by a priori data selection. Finally, we discuss the applicability of our approach.
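The frequency-wavenumber analysis underlying FK-based selection can be sketched with a 2-D FFT; this is a generic textbook construction, not the authors' implementation, and the synthetic plane wave below is purely illustrative:

```python
import numpy as np

def fk_spectrum(data, dt, dx):
    """Frequency-wavenumber amplitude spectrum of a 2-D record.

    data: array shaped (n_traces, n_samples) from a linear array.
    Waves travelling in opposite directions along the array fall in
    different wavenumber-sign quadrants of the f-k plane, which is
    what FK-based selection of directional noise exploits.
    """
    n_x, n_t = data.shape
    amp = np.abs(np.fft.fftshift(np.fft.fft2(data)))
    k = np.fft.fftshift(np.fft.fftfreq(n_x, d=dx))   # cycles per metre
    f = np.fft.fftshift(np.fft.fftfreq(n_t, d=dt))   # Hz
    return f, k, amp

# Hypothetical record: a single 25 Hz plane wave at 200 m/s crossing 24 traces
dt, dx, v = 0.004, 1.0, 200.0
t = np.arange(256) * dt
x = np.arange(24)[:, None] * dx
data = np.sin(2 * np.pi * 25.0 * (t[None, :] - x / v))
f, k, amp = fk_spectrum(data, dt, dx)
```

For this synthetic wave the spectral peak sits near f = 25 Hz and |k| = f/v = 0.125 cycles/m, i.e. on the linear moveout that directional noise traces in the f-k plane.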
NASA Astrophysics Data System (ADS)
Zhou, Q.
2015-12-01
Although three-dimensional (3-D) electrical resistivity tomography (ERT) surveys have become popular in site characterization and process monitoring, two-dimensional (2-D) ERT surveys are still often used in the field. This is because a 2-D ERT survey is relatively easy to carry out, and the focus of site characterization is often on a 2-D cross section rather than on the full 3-D subsurface structure. Examples of such practice include tunnel-line and fault-crossing surveys. In these cases, depending on the properties of the surface soil, a 2-D ERT survey with a pole-pole array may occasionally yield good-quality data, but it often produces a data set that mixes valid and erroneous measurements because of electrode-contact effects and far electrodes that are not placed far enough away. Without preprocessing, the apparent resistivity pseudo-section constructed from such a data set may deviate considerably from the real one, and the information obtained from it may be misleading or even completely incorrect. In this study, we developed a far-electrode dynamic correction method for preprocessing raw data from 2-D pole-pole ERT surveys. With this method, we can not only find and delete abnormal data points easily, but also locate the coordinates of the far electrodes as they actually functioned in the field, thereby removing far-electrode effects and making the best use of seemingly anomalous data points. The method also lets us judge the effects of electrode contact and avoid using affected data points in the subsequent construction of the apparent resistivity pseudo-section. With this preprocessing, the constructed apparent resistivity pseudo-section is demonstrated to be closer to the real one, which makes the subsequent inversion more robust. We will introduce this far-electrode dynamic correction method and show application examples in the meeting.
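For context, the ideal pole-pole apparent-resistivity calculation, whose geometric factor breaks down when the far electrodes are too close, can be sketched as follows; the numbers are illustrative, and this is not the authors' correction method:

```python
import math

def pole_pole_apparent_resistivity(a_m, delta_v, current):
    """Apparent resistivity (ohm-m) for an ideal pole-pole array.

    Assumes the two remote ("far") electrodes are effectively at
    infinity, so the geometric factor is simply 2*pi*a, with a the
    spacing between the current and potential electrodes. When the
    far electrodes are too close, this assumption fails and the
    computed value is biased, which is what far-electrode
    corrections address.
    """
    return 2.0 * math.pi * a_m * delta_v / current

# Illustrative reading: 5 m spacing, 0.318 V potential, 0.1 A current
rho_a = pole_pole_apparent_resistivity(a_m=5.0, delta_v=0.318, current=0.1)
```

A pseudo-section is then built by repeating this calculation for every electrode pair and plotting the values against spacing and midpoint.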
The T dwarf population in the UKIDSS LAS
NASA Astrophysics Data System (ADS)
Cardoso, C. V.; Burningham, B.; Smith, L.; Smart, R.; Pinfield, D.; Magazzù, A.; Ghinassi, F.; Lattanzi, M.
We present the most recent results from the UKIDSS Large Area Survey (LAS) census and follow-up of new T dwarfs in the local field. The new brown dwarf candidates are identified using optical and infrared survey photometry (UKIDSS and SDSS) and followed up with narrow-band methane photometry (TNG) and spectroscopy (Gemini and Subaru) to confirm their brown dwarf nature. Employing this procedure we have discovered several dozen new T dwarfs in the field. Using methane differential photometry as a proxy for spectral type of T dwarfs has proved to be a very efficient technique. This method can be useful in the future to reliably identify brown dwarfs in deep surveys that produce large samples of faint targets for which spectroscopy is not feasible for all candidates. With this statistically robust sample of the mid- and late-T field population, we were also able to address the discrepancies between the observed field space density and the values expected from the most widely accepted forms of the IMF of young clusters.
Lake shore and littoral habitat structure: a field survey method and its precision
Until recently, lake physical habitat assessment has been an underemployed tool for assessing lake and reservoir ecological condition. Herein, we outline and evaluate a rapid (2 persons: 1.5-3.5 h) field sampling and analytical approach for quantifying near-shore physical habit...
Practices of Cooperating Teachers Contributing to a High Quality Field Experience
ERIC Educational Resources Information Center
Lafferty, Karen Elizabeth
2015-01-01
This mixed methods study framed in cognitive apprenticeship theory involved cooperating and preservice teachers from 10 university-based credentialing programs in California. It examined the connection between cooperating teacher practices and preservice teachers' perceptions of a high quality field experience. Survey responses from 146…
Predicting the Rotor-Stator Interaction Acoustics of a Ducted Fan Engine
NASA Technical Reports Server (NTRS)
Biedron, Robert T.; Rumsey, Christopher L.; Podboy, Gary G.; Dunn, M. H.
2001-01-01
A Navier-Stokes computation is performed for a ducted-fan configuration with the goal of predicting rotor-stator noise generation without having to resort to heuristic modeling. The calculated pressure field in the inlet region is decomposed into classical infinite-duct modes, which are then used in either a hybrid finite-element/Kirchhoff surface method or boundary integral equation method to calculate the far field noise. Comparisons with experimental data are presented, including rotor wake surveys and far field sound pressure levels for two blade passage frequency (BPF) tones.
2015-06-18
Engineering Effectiveness Survey. CMU/SEI-2012-SR-009. Carnegie Mellon University. November 2012. Field, Andy. Discovering Statistics Using SPSS, 3rd...enough into the survey to begin answering questions on risk practices. All of the statistical data analysis will be performed using SPSS. Prior to...probabilistically using distributions for likelihood and impact. Statistical methods like Monte Carlo can more comprehensively evaluate the cost and
Hortness, J.E.
2004-01-01
The U.S. Geological Survey (USGS) measures discharge in streams using several methods. However, measurement of peak discharges is often impossible or impractical due to difficult access, the inherent danger of making measurements during flood events, and the timing often associated with flood events. Thus, many peak discharge values often are calculated after the fact by use of indirect methods. The most common indirect method for estimating peak discharges in streams is the slope-area method. This, like other indirect methods, requires measuring the flood profile through detailed surveys. Processing the survey data for efficient entry into computer streamflow models can be time demanding; SAM 2.1 is a program designed to expedite that process. The SAM 2.1 computer program is designed to be run in the field on a portable computer. The program processes digital surveying data obtained from an electronic surveying instrument during slope-area measurements. After all measurements have been completed, the program generates files to be input into the SAC (Slope-Area Computation program; Fulford, 1994) or HEC-RAS (Hydrologic Engineering Center-River Analysis System; Brunner, 2001) computer streamflow models so that an estimate of the peak discharge can be calculated.
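The hydraulics underlying the slope-area method can be illustrated with a single-section Manning estimate; this simplified form is a sketch only, since the actual SAC/HEC-RAS computations balance energy across several surveyed cross sections:

```python
def manning_discharge(area_ft2, hydraulic_radius_ft, slope, n):
    """Single-section Manning estimate of discharge (US customary units).

    Q = (1.486 / n) * A * R^(2/3) * S^(1/2), where A is flow area,
    R the hydraulic radius, S the energy slope from the surveyed
    flood profile, and n Manning's roughness coefficient. A one-
    section illustration of the hydraulics only.
    """
    return (1.486 / n) * area_ft2 * hydraulic_radius_ft ** (2.0 / 3.0) * slope ** 0.5

# Illustrative surveyed values for a flood cross section
q_cfs = manning_discharge(area_ft2=500.0, hydraulic_radius_ft=4.0, slope=0.002, n=0.035)
```

The energy slope S is exactly the quantity recovered from the detailed flood-profile surveys that SAM 2.1 processes.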
Kotsis, Ioannis; Kontoes, Charalabos; Paradissis, Dimitrios; Karamitsos, Spyros; Elias, Panagiotis; Papoutsis, Ioannis
2008-06-10
The primary objective of this paper is the evaluation of the InSAR derived displacement field caused by the 07/09/1999 Athens earthquake, using as reference an external data source provided by terrestrial surveying along the Mornos river open aqueduct. To accomplish this, a processing chain to render comparable the leveling measurements and the interferometric derived measurements has been developed. The distinct steps proposed include a solution for reducing the orbital and atmospheric interferometric fringes and an innovative method to compute the actual InSAR estimated vertical ground subsidence, for direct comparison with the leveling data. Results indicate that the modeled deformation derived from a series of stacked interferograms, falls entirely within the confidence interval assessed for the terrestrial surveying data.
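The step of converting an InSAR line-of-sight displacement into vertical subsidence, needed for direct comparison with leveling, can be sketched under the common assumption of purely vertical motion; the incidence angle and displacement below are illustrative values, not the paper's actual processing chain:

```python
import math

def los_to_vertical(d_los_m, incidence_deg):
    """Convert an InSAR line-of-sight displacement to vertical motion.

    Assumes the deformation is purely vertical (no horizontal
    component), so d_vert = d_los / cos(incidence). The ~23 degree
    incidence angle is typical of ERS-class sensors and is an
    assumption here, as is the displacement value.
    """
    return d_los_m / math.cos(math.radians(incidence_deg))

# Illustrative: 2.8 cm of range increase (subsidence) at 23 deg incidence
d_vert = los_to_vertical(d_los_m=-0.028, incidence_deg=23.0)
```

Because cos(incidence) < 1, the vertical motion is always slightly larger in magnitude than the raw line-of-sight measurement.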
Zone of Avoidance Tully-Fisher Survey
NASA Astrophysics Data System (ADS)
Williams, Wendy; Woudt, Patrick; Kraan-Korteweg, Renee
2009-10-01
We propose to use the Parkes telescope to obtain narrowband HI spectra of a sample of galaxies in the Galactic Zone of Avoidance (ZOA). These observations, combined with high-quality near-infrared photometry, will provide both the uniform coverage and accurate distance determinations (via the Tully-Fisher relation) required to map the peculiar velocity flow fields in the ZOA. The mass distribution in this region has a significant effect on the motion of the Local Group. Dynamically important structures, including the Great Attractor and the Local Void, are partially hidden behind our Galaxy. Even the most recent systematic all-sky surveys, such as the 2MASS Redshift Survey (2MRS; Huchra et al. 2005), undersample the ZOA due to stellar crowding and high dust extinction. While statistical reconstruction methods have been used to extrapolate the density field in the ZOA, they are unlikely to truly reflect the velocity field (Loeb & Narayan 2008). Our project aims for the first time to directly determine the velocity flow fields in this part of the sky. Our sample is taken from the Parkes HIZOA survey (Henning et al. 2005) and is unbiased with respect to extinction and star density.
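The peculiar velocities targeted by such a survey follow, to first order, from a Tully-Fisher distance and a measured redshift; a minimal sketch, with an assumed H0 and illustrative galaxy values:

```python
def peculiar_velocity(cz_km_s, distance_mpc, h0=70.0):
    """First-order peculiar velocity from a redshift and a distance.

    v_pec ~ cz - H0 * d, valid at low redshift. H0 (km/s/Mpc) is an
    assumed value; in practice the distance d would come from the
    Tully-Fisher relation applied to the galaxy's HI line width and
    near-infrared photometry.
    """
    return cz_km_s - h0 * distance_mpc

# Illustrative galaxy: cz = 4500 km/s, TF distance 60 Mpc
v_pec = peculiar_velocity(cz_km_s=4500.0, distance_mpc=60.0)
```

A positive residual indicates motion away from us relative to pure Hubble flow, e.g. infall toward a mass concentration beyond the galaxy.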
Methods for collection and analysis of water samples
Rainwater, Frank Hays; Thatcher, Leland Lincoln
1960-01-01
This manual contains methods used by the U.S. Geological Survey to collect, preserve, and analyze water samples. Throughout, the emphasis is on obtaining analytical results that accurately describe the chemical composition of the water in situ. Among the topics discussed are selection of sampling sites, frequency of sampling, field equipment, preservatives and fixatives, analytical techniques of water analysis, and instruments. Seventy-seven laboratory and field procedures are given for determining fifty-three water properties.
Offline Arabic handwriting recognition: a survey.
Lorigo, Liana M; Govindaraju, Venu
2006-05-01
The automatic recognition of text on scanned images has enabled many applications such as searching for words in large volumes of documents, automatic sorting of postal mail, and convenient editing of previously printed documents. The domain of handwriting in the Arabic script presents unique technical challenges and has been addressed more recently than other domains. Many different methods have been proposed and applied to various types of images. This paper provides a comprehensive review of these methods. It is the first survey to focus on Arabic handwriting recognition and the first Arabic character recognition survey to provide recognition rates and descriptions of test data for the approaches discussed. It includes background on the field, discussion of the methods, and future research directions.
ERIC Educational Resources Information Center
Pratt, Daniel J.; Wine, Jennifer S.; Heuer, Ruth E.; Whitmore, Roy W.; Kelly, Janice E.; Doherty, John M.; Simpson, Joe B.; Marti, Norma
This report describes the methods and procedures used for the field test of the Beginning Postsecondary Students Longitudinal Study First Followup 1996-98 (BPS:96/98). Students in this survey were first interviewed during 1995 as part of the National Postsecondary Student Aid Study 1996 field test. The BPS:96/98 full-scale student sample includes…
This manuscript describes a novel statistical analysis technique developed by the authors for use in combining survey data carried out under different field protocols. We apply the technique to 83 years of survey data on avian songbird populations in northern lower Michigan to de...
Modification of Point Counts for Surveying Cropland Birds
Kathryn Freemark; Catherine Rogers
1995-01-01
As part of a comparative study of agricultural impacts on wildlife, modifications to the point count method were evaluated for surveying birds in, and adjacent to, cropland during the breeding season (May to early July) in Ontario. Location in the field, observation direction and distance, number of visits, and number of study sites per farm were examined using point...
ERIC Educational Resources Information Center
St. Louis, Kenneth O.
2011-01-01
Purpose: The "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was developed to make available worldwide a standard measure of public attitudes toward stuttering that is practical, reliable, valid, and translatable. Mean data from past field studies as comparisons for interpretation of "POSHA-S" results are reported. Method: Means…
NASA Astrophysics Data System (ADS)
Yahil, Amos; Strauss, Michael A.; Davis, Marc; Huchra, John P.
1991-11-01
In the paper, "A Redshift Survey of IRAS Galaxies. II. Methods for Determining Self-consistent Velocity and Density Fields" by Amos Yahil, Michael A. Strauss, Marc Davis, and John P. Huchra (ApJ, 372,380 [1991]), Figures 14 and 15 were presented out of order, with their legends reversed. Thus, the figure at the bottom of page 391 is Figure 15, and should have the legend: "Fig. 15.-As in Fig. 13, for the method 3 results." The figure at the top of page 392 is Figure 14, and should have the legend: "Fig. 14.-Plot in Galactic coordinates of the quantity V_diff_ for galaxies within 3000 km s^-1^ of the LG. The symbol size is proportional to V_diff_ - 400 km s^-1^, which measures the deviation of the redshift- distance relation along the line of sight to that galaxy from pure Hubble flow."
A survey of visual preprocessing and shape representation techniques
NASA Technical Reports Server (NTRS)
Olshausen, Bruno A.
1988-01-01
Many recent theories and methods proposed for visual preprocessing and shape representation are summarized. The survey brings together research from the fields of biology, psychology, computer science, electrical engineering, and most recently, neural networks. It was motivated by the need to preprocess images for a sparse distributed memory (SDM), but the techniques presented may also prove useful for applying other associative memories to visual pattern recognition. The material of this survey is divided into three sections: an overview of biological visual processing; methods of preprocessing (extracting parts of shape, texture, motion, and depth); and shape representation and recognition (form invariance, primitives and structural descriptions, and theories of attention).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okura, Yuki; Petri, Andrea; May, Morgan
Weak gravitational lensing causes subtle changes in the apparent shapes of galaxies due to the bending of light by the gravity of foreground masses. By measuring the shapes of large numbers of galaxies (millions in recent surveys, up to tens of billions in future surveys) we can infer the parameters that determine cosmology. Imperfections in the detectors used to record images of the sky can introduce changes in the apparent shape of galaxies, which in turn can bias the inferred cosmological parameters. Here in this paper we consider the effect of two widely discussed sensor imperfections: tree-rings, due to impurity gradients which cause transverse electric fields in the Charge-Coupled Devices (CCD), and pixel-size variation, due to periodic CCD fabrication errors. These imperfections can be observed when the detectors are subject to uniform illumination (flat field images). We develop methods to determine the spurious shear and convergence (due to the imperfections) from the flat-field images. We calculate how the spurious shear, when added to the lensing shear, will bias the determination of cosmological parameters. We apply our methods to candidate sensors of the Large Synoptic Survey Telescope (LSST) as a timely and important example, analyzing flat field images recorded with LSST prototype CCDs in the laboratory. In conclusion, we find that tree-rings and periodic pixel-size variation present in the LSST CCDs will introduce negligible bias to cosmological parameters determined from the lensing power spectrum, specifically w, Ω_m, and σ_8.
NASA Astrophysics Data System (ADS)
Wagner, Bianca; Leiss, Bernd
2017-04-01
For some years now, an extraordinary rise in the development and application of new surveying techniques has been taking place in the geosciences. Traditional field work has been massively altered by, e.g., terrestrial laser scanning, Unmanned Aerial Vehicles (UAVs), hyperspectral mapping and Structure-from-Motion (SfM). The next impetus for innovation is the demonstration and analysis of the resulting digital models by means of Virtual Reality (VR) or Augmented Reality (AR). On the market, there are many new field tools and devices, numerous free and commercial software packages, and diverse solutions for visualization. Because some of these methods are affordable and easy to learn, the number of users is growing steadily. However, what are the real scientific outcomes? Which methods really make sense compared with traditional field work, and which can be incorporated into everyday workflows or teaching? Which standards does the community have, and which does it need? Where will the challenges and trends be in the upcoming years? Which accuracy and resolution do we need? What are the requirements in terms of sustainable (open) data management, presentation, and advanced analysis of such data formats? Our contribution presents some answers as well as impulses to stimulate discussion in the 3D survey and modeling community.
Comparison of acoustic recorders and field observers for monitoring tundra bird communities
Vold, Skyler T.; Handel, Colleen M.; McNew, Lance B.
2017-01-01
Acoustic recorders can be useful for studying bird populations but their efficiency and accuracy should be assessed in pertinent ecological settings before use. We investigated the utility of an acoustic recorder for monitoring abundance of tundra‐breeding birds relative to point‐count surveys in northwestern Alaska, USA, during 2014. Our objectives were to 1) compare numbers of birds and species detected by a field observer with those detected simultaneously by an acoustic recorder; 2) evaluate how detection probabilities for the observer and acoustic recorder varied with distance of birds from the survey point; and 3) evaluate whether avian guild‐specific detection rates differed between field observers and acoustic recorders relative to habitat. Compared with the observer, the acoustic recorder detected fewer species (β_Method = −0.39 ± 0.07) and fewer individuals (β_Method = −0.56 ± 0.05), both in total and for 6 avian guilds. Discrepancies were attributed primarily to differences in effective area surveyed (91% of birds missed by the device were >100 m away), but also to nonvocal birds being missed by the recorder (55% of birds missed within 100 m were silent). The observer missed a few individuals and one species detected by the device. Models indicated that relative abundance of various avian guilds was associated primarily with maximum shrub height and less so with shrub cover and visual obstruction. The absence of a significant interaction between survey method (observer vs. acoustic recorder) and any habitat characteristic suggests that traditional point counts and acoustic recorders would yield similar inferences about ecological relationships in tundra ecosystems. Pairing of the 2 methods could increase survey efficiency and allow for validation and archival of survey results.
NASA Astrophysics Data System (ADS)
Shoushtari, M. A.; Sadeghi-Niaraki, H.
2014-10-01
The growing trend in technological advances and Micro-Electro-Mechanical Systems (MEMS) is aimed at making human life more intelligent; accordingly, Mark Weiser proposed the Ubiquitous Computing approach. This paper proposes a ubiquitous solution in the geomatics and surveying field. Ubiquitous Surveying provides cost-effective, smart, and widely available surveying techniques, whereas traditional surveying equipment is expensive and of limited availability, especially for indoor and everyday surveying jobs. To build a smart surveying instrument, different information-technology methods and tools are used, including the triangle method, the Received Signal Strength Indicator (RSSI) method, and a laser sensor. Combined with standard surveying equations, these introduce a modern instrument called the Ubi-Total Station, which also employs sensors embedded in a smartphone and a mobile stand. RSSI-based localization and the triangle method are simple, well-known techniques for predicting the position of an unknown node in indoor environments, although additional measures are required for sufficient accuracy. The main goal of this paper is to introduce the Ubiquitous Total Station as a development in smart and ubiquitous GIS. To make surveying equipment accessible to the public, the instrument was designed and implemented: a conceptual model of the smartphone-based system was designed, and, based on this model, an Android application was developed as a first prototype. The evaluations show absolute errors of 0.028 m and 0.057 m in the X and Y coordinates, respectively, and an RMSE of 0.26 for RSSI-based distance measurement. The high price of traditional equipment and its reliance on professional surveyors is giving way to intelligent surveying; in the suggested system, smartphones can be used as tools for positioning and for coordinating geometric information about objects.
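RSSI-based distance estimation of the kind evaluated here is commonly done with the log-distance path-loss model. A minimal sketch (the calibration value and path-loss exponent below are illustrative assumptions, not parameters from the paper):

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Estimate distance (m) from RSSI via the log-distance path-loss model.

    RSSI(d) = tx_power - 10 * n * log10(d), so
    d = 10 ** ((tx_power - RSSI) / (10 * n)).

    tx_power_dbm: RSSI measured at 1 m (an assumed calibration value).
    path_loss_exponent: ~2.0 in free space, higher indoors.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# At the calibration point the model returns exactly 1 m:
print(rssi_to_distance(-59.0))  # -> 1.0
# A weaker signal maps to a larger distance:
print(rssi_to_distance(-79.0))  # -> 10.0
```

In practice the exponent must be fitted per environment, which is one source of the distance-measurement RMSE the abstract reports.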
NASA Astrophysics Data System (ADS)
Castillo, Carlos; Pérez, Rafael
2017-04-01
The assessment of gully erosion volumes is essential for the quantification of soil losses derived from this relevant degradation process. Traditionally, 2D and 3D approaches have been applied for this purpose (Casalí et al., 2006). Although innovative 3D approaches have recently been proposed for gully volume quantification, a renewed interest can be found in the literature regarding the useful information that cross-section analysis still provides in gully erosion research. Moreover, methods based on 2D approaches can be the most cost-effective option in many situations, such as preliminary studies with low accuracy requirements or surveys under time or budget constraints. The main aim of this work is to examine the key factors controlling volume error variability in 2D gully assessment by means of a stochastic experiment involving a Monte Carlo analysis over synthetic gully profiles, in order to 1) contribute to a better understanding of the drivers and magnitude of the uncertainty of 2D gully erosion surveys and 2) provide guidelines for optimal survey designs. Owing to the stochastic properties of error generation in 2D volume assessment, a statistical approach was followed to generate a large and significant set of gully reach configurations and to evaluate quantitatively the influence of the main factors controlling the uncertainty of the volume assessment.
For this purpose, a simulation algorithm was written in Matlab® code, involving the following stages: 1) generation of synthetic gully area profiles with different degrees of complexity (characterized by the cross-section variability); 2) simulation of field measurements characterized by a survey intensity and the precision of the measurement method; and 3) quantification of the volume error uncertainty as a function of the key factors. In this communication we present the relationships between volume error and the studied factors and propose guidelines for 2D field surveys, based on the minimal survey densities required to achieve a given accuracy for the cross-sectional variability of a gully and the measurement method applied. References: Casalí, J., Loizu, J., Campo, M.A., De Santisteban, L.M., Alvarez-Mozos, J., 2006. Accuracy of methods for field assessment of rill and ephemeral gully erosion. Catena 67, 128-138. doi:10.1016/j.catena.2006.03.005
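The three simulation stages can be sketched in a few lines. This is not the authors' Matlab® code; the profile model (a bounded random walk of cross-sectional areas), noise model, and all parameter values are illustrative assumptions:

```python
import random

def simulate_volume_error(n_sections=200, section_spacing=1.0,
                          survey_step=10, meas_sd=0.05, seed=0):
    """Monte Carlo sketch of 2D gully volume error.

    A synthetic gully is a sequence of cross-sectional areas (m^2).
    The 'true' volume integrates every section; the simulated survey
    measures only every `survey_step`-th section, with multiplicative
    Gaussian measurement noise, and extrapolates from the sampled mean.
    Returns the relative volume error.
    """
    rng = random.Random(seed)
    # Stage 1: synthetic area profile with random-walk variability.
    areas, a = [], 1.0
    for _ in range(n_sections):
        a = max(0.1, a + rng.gauss(0.0, 0.1))
        areas.append(a)
    true_volume = sum(areas) * section_spacing
    # Stage 2: simulated field measurements (survey intensity + precision).
    sampled = [areas[i] * (1.0 + rng.gauss(0.0, meas_sd))
               for i in range(0, n_sections, survey_step)]
    # Stage 3: volume estimate and its error.
    est_volume = (sum(sampled) / len(sampled)) * n_sections * section_spacing
    return (est_volume - true_volume) / true_volume

# Denser surveys should shrink the typical error:
errs_sparse = [abs(simulate_volume_error(survey_step=40, seed=s)) for s in range(200)]
errs_dense = [abs(simulate_volume_error(survey_step=2, seed=s)) for s in range(200)]
print(sum(errs_sparse) / 200, sum(errs_dense) / 200)
```

Repeating this over many synthetic reaches and parameter combinations is exactly the kind of experiment that yields minimal-survey-density guidelines.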
A New Method for Wide-field Near-IR Imaging with the Hubble Space Telescope
NASA Astrophysics Data System (ADS)
Momcheva, Ivelina G.; van Dokkum, Pieter G.; van der Wel, Arjen; Brammer, Gabriel B.; MacKenty, John; Nelson, Erica J.; Leja, Joel; Muzzin, Adam; Franx, Marijn
2017-01-01
We present a new technique for wide and shallow observations using the near-infrared channel of Wide Field Camera 3 (WFC3) on the Hubble Space Telescope (HST). Wide-field near-IR surveys with HST are generally inefficient, as guide star acquisitions make it impractical to observe more than one pointing per orbit. This limitation can be circumvented by guiding with gyros alone, which is possible as long as the telescope has three functional gyros. The method presented here allows us to observe mosaics of eight independent WFC3-IR pointings in a single orbit by utilizing the fact that HST drifts by only a very small amount in the 25 s between non-destructive reads of unguided exposures. By shifting the reads and treating them as independent exposures the full resolution of WFC3 can be restored. We use this “drift and shift” (DASH) method in the Cycle 23 COSMOS-DASH program, which will obtain 456 WFC3 H160 pointings in 57 orbits, covering an area of 0.6 square degrees in the COSMOS field down to H160 = 25. When completed, the program will more than triple the area of extra-galactic survey fields covered by near-IR imaging at HST resolution. We demonstrate the viability of the method with the first four orbits (32 pointings) of this program. We show that the resolution of the WFC3 camera is preserved, and that structural parameters of galaxies are consistent with those measured in guided observations.
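The core of DASH is that differences between successive non-destructive reads act as short independent exposures, each of which can be shifted back by its accumulated drift before co-addition. A toy NumPy sketch with assumed integer-pixel drifts (real pipelines measure sub-pixel shifts from the data themselves):

```python
import numpy as np

def dash_coadd(reads, offsets):
    """Co-add unguided ramp reads in the spirit of 'drift and shift'.

    reads:   list of cumulative non-destructive reads (2D arrays).
    offsets: cumulative (dy, dx) drift of each read interval,
             assumed known/measured here.

    Differencing consecutive reads yields independent short exposures;
    each is shifted back by its drift and summed, restoring resolution
    that a plain sum of drifted reads would smear.
    """
    coadd = np.zeros_like(reads[0], dtype=float)
    for k in range(1, len(reads)):
        exposure = reads[k] - reads[k - 1]      # one ~25 s "exposure"
        dy, dx = offsets[k - 1]
        coadd += np.roll(exposure, (-dy, -dx), axis=(0, 1))  # undo drift
    return coadd

# A point source drifting by 1 pixel per read interval:
img = np.zeros((9, 9)); img[4, 4] = 1.0
reads = [np.zeros((9, 9))]
for k in range(1, 4):  # cumulative reads of a drifting source
    reads.append(reads[-1] + np.roll(img, (k - 1, 0), axis=(0, 1)))
offsets = [(0, 0), (1, 0), (2, 0)]
out = dash_coadd(reads, offsets)
print(out[4, 4])  # all three exposures land back on the same pixel -> 3.0
```

Summing the drifted reads directly would spread the source over three pixels; the shift-and-add step recovers it at full depth on one pixel.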
Farmers' Preferences for Methods of Receiving Information on New or Innovative Farming Practices.
ERIC Educational Resources Information Center
Riesenberg, Lou E.; Gor, Christopher Obel
1989-01-01
Survey of 386 Idaho farmers (response rate 58 percent) identified preferred methods of receiving information on new or innovative farming practices. Analysis revealed preference for interpersonal methods (demonstrations, tours, and field trips) over mass media such as computer-assisted instruction (CAI) and home study, although younger farmers,…
NASA Astrophysics Data System (ADS)
Price-Whelan, Adrian M.; Agueros, M. A.; Fournier, A.; Street, R.; Ofek, E.; Levitan, D. B.; PTF Collaboration
2013-01-01
Many current photometric, time-domain surveys are driven by specific goals such as searches for supernovae or transiting exoplanets, or studies of stellar variability. These goals in turn set the cadence with which individual fields are re-imaged. In the case of the Palomar Transient Factory (PTF), several such sub-surveys are being conducted in parallel, leading to extremely non-uniform sampling over the survey's nearly 20,000 sq. deg. footprint. While the typical 7.26 sq. deg. PTF field has been imaged 20 times in R-band, ~2300 sq. deg. have been observed more than 100 times. We use the existing PTF data (6.4 × 10^7 light curves) to study the trade-off that occurs when searching for microlensing events when one has access to a large survey footprint with irregular sampling. To examine the probability that microlensing events can be recovered in these data, we also test previous statistics used on uniformly sampled data to identify variables and transients. We find that one such statistic, the von Neumann ratio, performs best for identifying simulated microlensing events. We develop a selection method using this statistic and apply it to data from all PTF fields with >100 observations to uncover a number of interesting candidate events. This work can help constrain all-sky event rate predictions and tests microlensing signal recovery in large datasets, both of which will be useful to future wide-field, time-domain surveys such as the LSST.
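The von Neumann ratio compares the mean squared successive difference of a light curve to its variance; smooth, correlated excursions such as a microlensing bump drive it well below the value of ≈2 expected for uncorrelated noise. A minimal sketch (the simulated light curves are illustrative, not PTF data):

```python
import numpy as np

def von_neumann_ratio(mags):
    """η = mean squared successive difference / variance.

    η ≈ 2 for uncorrelated noise; smoothly varying signals
    (e.g., a microlensing bump) give η << 2.
    """
    mags = np.asarray(mags, dtype=float)
    diffs = np.diff(mags)
    return np.mean(diffs**2) / np.var(mags)

rng = np.random.default_rng(42)
noise = rng.normal(0.0, 0.05, 500)            # flat light curve
t = np.linspace(-5, 5, 500)
bump = 0.5 * np.exp(-t**2 / 2.0)              # smooth event on top of noise
print(von_neumann_ratio(noise))               # near 2: no correlated signal
print(von_neumann_ratio(noise + bump))        # far below 2: event candidate
```

Because the statistic only uses consecutive pairs, it tolerates the irregular sampling described in the abstract better than statistics that assume a uniform cadence.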
NASA Technical Reports Server (NTRS)
Mclaughlin, W. I.; Lundy, S. A.; Ling, H. Y.; Stroberg, M. W.
1980-01-01
The coverage of the celestial sphere or the surface of the earth with a narrow-field instrument onboard a satellite can be described by a set of swaths on the sphere. A transect is a curve on this sphere constructed to sample the coverage. At each point on the transect the number of times that the field-of-view of the instrument has passed over the point is recorded. This information is conveniently displayed as an integer-valued histogram over the length of the transect. The effectiveness of the transect method for a particular observing plan, and the best placement of the transects, depends upon the structure of the set of observations. Survey missions are usually characterized by a somewhat parallel alignment of the instrument swaths. Using autocorrelation and cross-correlation functions among the histograms, the structure of a survey has been decomposed into two components, each illustrated by a simple mathematical model. The complex, all-sky survey to be performed by the Infrared Astronomical Satellite (IRAS) is synthesized in some detail utilizing the objectives and constraints of that mission. This survey possesses the components predicted by the simple models, and this information is useful in characterizing the properties of the IRAS survey and the placement of the transects as a function of celestial latitude and certain structural properties of the coverage.
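The integer-valued histogram described above amounts to counting, at each transect sample point, how many instrument swaths contain it. A simplified 1-D sketch in which swaths are intervals of the transect parameter (in reality both transects and swaths are curves on the sphere):

```python
def transect_histogram(transect_points, swaths):
    """Number of swath passes over each point of a transect.

    transect_points: positions sampled along the transect.
    swaths: (start, end) intervals of the transect parameter covered
            by one pass of the instrument field-of-view.
    """
    return [sum(1 for lo, hi in swaths if lo <= p <= hi)
            for p in transect_points]

points = [0.0, 0.25, 0.5, 0.75, 1.0]
swaths = [(0.0, 0.6), (0.4, 1.0), (0.45, 0.55)]  # roughly parallel passes
print(transect_histogram(points, swaths))  # -> [1, 1, 3, 1, 1]
```

Autocorrelation of such histograms along one transect, and cross-correlation between transects, is what reveals the structural components discussed in the abstract.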
Mapping forest canopy gaps using air-photo interpretation and ground surveys
Fox, T.J.; Knutson, M.G.; Hines, R.K.
2000-01-01
Canopy gaps are important structural components of forested habitats for many wildlife species. Recent improvements in the spatial accuracy of geographic information system tools facilitate accurate mapping of small canopy features such as gaps. We compared canopy-gap maps generated using ground survey methods with those derived from air-photo interpretation. We found that maps created from high-resolution air photos were more accurate than those created from ground surveys. Errors of omission were 25.6% for the ground-survey method and 4.7% for the air-photo method. One variable of interest in songbird research is the distance from nests to gap edges. Distances from real and simulated nests to gap edges were longer using the ground-survey maps versus the air-photo maps, indicating that gap omission could potentially bias the assessment of spatial relationships. If research or management goals require location and size of canopy gaps and specific information about vegetation structure, we recommend a 2-fold approach. First, canopy gaps can be located and the perimeters defined using 1:15,000-scale or larger aerial photographs and the methods we describe. Mapped gaps can then be field-surveyed to obtain detailed vegetation data.
NASA Astrophysics Data System (ADS)
Lanusse, F.; Rassat, A.; Starck, J.-L.
2015-06-01
Context. Upcoming spectroscopic galaxy surveys hold great promise for addressing the major challenges of cosmology, in particular understanding the nature of the dark universe. The strength of these surveys, naturally described in spherical geometry, comes from their unprecedented depth and width, but an optimal extraction of their three-dimensional information is of utmost importance to best constrain the properties of the dark universe. Aims: Although there is theoretical motivation, and there are novel tools, to explore these surveys using the 3D spherical Fourier-Bessel (SFB) power spectrum of galaxy number counts, Cℓ(k,k'), most survey optimisations and forecasts are based on the tomographic spherical harmonic power spectrum, C_ℓ^(ij). The goal of this paper is to perform a new investigation of the information that can be extracted from these two analyses in the context of planned stage IV wide-field galaxy surveys. Methods: We compared tomographic and 3D SFB techniques by comparing the forecast cosmological parameter constraints obtained from a Fisher analysis. The comparison was made possible by careful and coherent treatment of non-linear scales in the two analyses, which makes this study the first to compare 3D SFB and tomographic constraints on an equal footing. Nuisance parameters related to a scale- and redshift-dependent galaxy bias were also included in the computation of the 3D SFB and tomographic power spectra for the first time. Results: Tomographic and 3D SFB methods can recover similar constraints in the absence of systematics. This requires choosing an optimal number of redshift bins for the tomographic analysis, which we computed to be N = 26 for zmed ≃ 0.4, N = 30 for zmed ≃ 1.0, and N = 42 for zmed ≃ 1.7. When marginalising over nuisance parameters related to the galaxy bias, the forecast 3D SFB constraints are less affected by this source of systematics than the tomographic constraints.
In addition, the rate of increase of the figure of merit as a function of median redshift is higher for the 3D SFB method than for the 2D tomographic method. Conclusions: Constraints from the 3D SFB analysis are less sensitive to unavoidable systematics stemming from a redshift- and scale-dependent galaxy bias. Even for surveys that are optimised with tomography in mind, a 3D SFB analysis is more powerful. In addition, for survey optimisation, the figure of merit for the 3D SFB method increases more rapidly with redshift, especially at higher redshifts, suggesting that the 3D SFB method should be preferred for designing and analysing future wide-field spectroscopic surveys. CosmicPy, the Python package developed for this paper, is freely available at https://cosmicpy.github.io. Appendices are available in electronic form at http://www.aanda.org
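The figure of merit used to rank such forecasts is typically the inverse area of the marginalized error ellipse of two parameters, computed from a Fisher matrix. A generic sketch (the Fisher matrix below is a made-up example, not the paper's):

```python
import numpy as np

def figure_of_merit(fisher, i, j):
    """Figure of merit for parameters i and j from a Fisher matrix.

    Marginalize by inverting the full Fisher matrix, extract the 2x2
    covariance sub-block, and take 1/sqrt(det) — the inverse area of
    the marginalized error ellipse (up to a constant factor).
    """
    cov = np.linalg.inv(fisher)
    sub = cov[np.ix_([i, j], [i, j])]
    return 1.0 / np.sqrt(np.linalg.det(sub))

# Toy 3-parameter Fisher matrix (illustrative numbers only):
F = np.array([[40.0, 5.0, 2.0],
              [5.0, 30.0, 1.0],
              [2.0, 1.0, 20.0]])
fom_marginalized = figure_of_merit(F, 0, 1)
# Adding independent information (e.g., a prior) can only help:
fom_with_prior = figure_of_merit(F + np.diag([10.0, 0.0, 0.0]), 0, 1)
print(fom_with_prior > fom_marginalized)  # -> True
```

Marginalizing over extra nuisance parameters (such as galaxy-bias parameters) works the same way in reverse: enlarging the Fisher matrix with poorly constrained directions inflates the sub-block covariance and lowers the figure of merit, which is the comparison the abstract reports.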
Modeling the Sensitivity of Field Surveys for Detection of Environmental DNA (eDNA)
Schultz, Martin T.; Lance, Richard F.
2015-01-01
The environmental DNA (eDNA) method is the practice of collecting environmental samples and analyzing them for the presence of a genetic marker specific to a target species. Little is known about the sensitivity of the eDNA method. Sensitivity is the probability that the target marker will be detected if it is present in the water body. Methods and tools are needed to assess the sensitivity of sampling protocols, design eDNA surveys, and interpret survey results. In this study, the sensitivity of the eDNA method is modeled as a function of ambient target marker concentration. The model accounts for five steps of sample collection and analysis, including: 1) collection of a filtered water sample from the source; 2) extraction of DNA from the filter and isolation in a purified elution; 3) removal of aliquots from the elution for use in the polymerase chain reaction (PCR) assay; 4) PCR; and 5) genetic sequencing. The model is applicable to any target species. For demonstration purposes, the model is parameterized for bighead carp (Hypophthalmichthys nobilis) and silver carp (H. molitrix) assuming sampling protocols used in the Chicago Area Waterway System (CAWS). Simulation results show that eDNA surveys have a high false negative rate at low concentrations of the genetic marker. This is attributed to processing of water samples and division of the extraction elution in preparation for the PCR assay. Increases in field survey sensitivity can be achieved by increasing sample volume, sample number, and PCR replicates. Increasing sample volume yields the greatest increase in sensitivity. It is recommended that investigators estimate and communicate the sensitivity of eDNA surveys to help facilitate interpretation of eDNA survey results. In the absence of such information, it is difficult to evaluate the results of surveys in which no water samples test positive for the target marker. 
It is also recommended that invasive species managers articulate concentration-based sensitivity objectives for eDNA surveys. In the absence of such information, it is difficult to design appropriate sampling protocols. The model provides insights into how sampling protocols can be designed or modified to achieve these sensitivity objectives. PMID:26509674
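The staged structure of the model just described — markers captured in the filtered water sample, carried through extraction, and split into PCR aliquots — can be sketched as a chain of probabilities. All parameter values below are illustrative assumptions, not the paper's CAWS calibration:

```python
import math

def detection_probability(conc_per_l, sample_vol_l=2.0, n_samples=1,
                          extraction_eff=0.5, aliquot_frac=0.1,
                          n_pcr_reps=8):
    """Sensitivity of an eDNA survey vs. ambient marker concentration.

    Marker copies captured by one filtered sample ~ Poisson(conc * vol);
    each copy survives extraction with prob extraction_eff and lands in
    a given PCR aliquot with prob aliquot_frac. A PCR replicate detects
    if it receives >= 1 copy; the survey detects if any replicate of
    any sample does. (Thinned Poisson -> closed form.) Replicates are
    treated as independent thinnings, a simplifying assumption.
    """
    lam_per_rep = conc_per_l * sample_vol_l * extraction_eff * aliquot_frac
    p_rep_miss = math.exp(-lam_per_rep)      # P(no copies in one aliquot)
    p_sample_miss = p_rep_miss ** n_pcr_reps
    return 1.0 - p_sample_miss ** n_samples

# Sensitivity rises steeply with concentration (false negatives at low conc):
for c in (0.1, 1.0, 10.0):
    print(round(detection_probability(c), 3))  # -> 0.077, 0.551, 1.0
```

Even this simplified chain reproduces the abstract's headline result: at low marker concentrations the false-negative rate is high, because most of the captured material is lost between filtration and the PCR aliquots.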
Seismoelectric imaging of shallow targets
Haines, S.S.; Pride, S.R.; Klemperer, S.L.; Biondi, B.
2007-01-01
We have undertaken a series of controlled field experiments to develop seismoelectric experimental methods for near-surface applications and to improve our understanding of seismoelectric phenomena. In a set of off-line geometry surveys (source separated from the receiver line), we place seismic sources and electrode array receivers on opposite sides of a man-made target (two sand-filled trenches) to record separately two previously documented seismoelectric modes: (1) the electromagnetic interface response signal created at the target and (2) the coseismic electric fields located within a compressional seismic wave. With the seismic source point in the center of a linear electrode array, we identify the previously undocumented seismoelectric direct field, and the Lorentz field of the metal hammer plate moving in the earth's magnetic field. We place the seismic source in the center of a circular array of electrodes (radial and circumferential orientations) to analyze the source-related direct and Lorentz fields and to establish that these fields can be understood in terms of simple analytical models. Using an off-line geometry, we create a multifold, 2D image of our trenches as dipping layers, and we also produce a complementary synthetic image through numerical modeling. These images demonstrate that off-line geometry (e.g., crosswell) surveys offer a particularly promising application of the seismoelectric method because they effectively separate the interface response signal from the (generally much stronger) coseismic and source-related fields. © 2007 Society of Exploration Geophysicists.
A Study of Quasar Selection in the Supernova Fields of the Dark Energy Survey
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tie, S. S.; Martini, P.; Mudd, D.
In this paper, we present a study of quasar selection using the supernova fields of the Dark Energy Survey (DES). We used a quasar catalog from an overlapping portion of the SDSS Stripe 82 region to quantify the completeness and efficiency of selection methods involving color, probabilistic modeling, variability, and combinations of color/probabilistic modeling with variability. In all cases, we considered only objects that appear as point sources in the DES images. We examine color selection methods based on the Wide-field Infrared Survey Explorer (WISE) mid-IR W1-W2 color, a mixture of WISE and DES colors (g - i and i - W1), and a mixture of Vista Hemisphere Survey and DES colors (g - i and i - K). For probabilistic quasar selection, we used XDQSO, an algorithm that employs an empirical multi-wavelength flux model of quasars to assign quasar probabilities. Our variability selection uses the multi-band χ²-probability that sources are constant in the DES Year 1 griz-band light curves. The completeness and efficiency are calculated relative to an underlying sample of point sources that are detected in the required selection bands and pass our data quality and photometric error cuts. We conduct our analyses at two magnitude limits, i < 19.8 mag and i < 22 mag. For the subset of sources with W1 and W2 detections, the W1-W2 color or XDQSOz method combined with variability gives the highest completenesses of >85% for both i-band magnitude limits and efficiencies of >80% to the bright limit and >60% to the faint limit; however, the giW1 and giW1+variability methods give the highest quasar surface densities. The XDQSOz method and combinations of W1W2/giW1/XDQSOz with variability are among the better selection methods when both high completeness and high efficiency are desired. We also present the OzDES Quasar Catalog of 1263 spectroscopically confirmed quasars from three years of OzDES observation in the 30 deg² of the DES supernova fields. Finally, the catalog includes quasars with redshifts up to z ~ 4 and brighter than i = 22 mag, although the catalog is not complete up to this magnitude limit.
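The variability cut — a χ²-based test of whether a light curve is consistent with a constant source — can be sketched per band as a chi-square about the inverse-variance-weighted mean flux. This is an illustrative version, not the DES pipeline:

```python
import numpy as np

def constancy_chi2(fluxes, errors):
    """Reduced chi-square of a light curve about its weighted mean flux.

    Values near 1 are consistent with a constant source; values >> 1
    flag variability (quasar candidates). A pipeline would convert this
    statistic to a probability of constancy in each band.
    """
    fluxes, errors = np.asarray(fluxes, float), np.asarray(errors, float)
    w = 1.0 / errors**2
    mean = np.sum(w * fluxes) / np.sum(w)
    chi2 = np.sum(w * (fluxes - mean) ** 2)
    return chi2 / (len(fluxes) - 1)

rng = np.random.default_rng(1)
err = np.full(50, 0.05)
constant = 1.0 + rng.normal(0.0, 0.05, 50)             # noise only
variable = constant + 0.3 * np.sin(np.arange(50) / 3)  # added variability
print(constancy_chi2(constant, err))  # near 1: consistent with constant
print(constancy_chi2(variable, err))  # >> 1: flagged as variable
```

Combining such per-band statistics across griz is what gives the multi-band probability of constancy used in the selection.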
About well-posed definition of geophysical fields'
NASA Astrophysics Data System (ADS)
Ermokhine, Konstantin; Zhdanova, Ludmila; Litvinova, Tamara
2013-04-01
We introduce a new approach to the downward continuation of geophysical fields based on approximating the observed data by continued fractions. Key words: downward continuation, continued fraction, Viskovatov's algorithm. Many papers in geophysics are devoted to the downward continuation of geophysical fields from the earth's surface into the lower half-space. A known obstacle to practical use of the method is the field's breakdown near the pole closest to the earth's surface. This is explained by an inadequate mathematical description of the studied fields: linear representations of the field in polynomial form, or as Taylor or Fourier series, lead to essential and unremovable instability of the inverse problem, since a field with singularities in the form of poles in the lower half-space fundamentally cannot be described adequately by a linear construction. Description of the field by rational fractions is closer to reality; in this case, the poles of the function in the lower half-space correspond to the zeros of the denominator. The method proposed below is based on continued fractions. We consider a function measured along a profile and represent it as a Chebyshev series (first reducing the argument to the interval [-1, 1]). There are many ways to represent a power series by a continued fraction, and the convergence regions of a series and of the corresponding continued fraction may differ essentially. As our investigations have shown, the most suitable mathematical construction for the continuation of geophysical fields is the so-called general C-fraction (where z designates the depth). For constructing the C-fraction corresponding to a power series there exists a rather effective and stable procedure, Viskovatov's algorithm (Viskovatov B., "De la méthode générale pour réduire toutes sortes des quantités en fractions continues", Mémoires de l'Académie Impériale des Sciences de St. Pétersbourg, 1, 1805).
This fundamentally new algorithm for downward continuation of a field measured at the surface into the underground half-space allows interpretation of geophysical data: building a cross-section and determining the depth, approximate shape, and size of the sources of the fields measured at the surface. The method applies to any geophysical survey: magnetic, gravimetric, electrical exploration, seismic, geochemical, etc. It was tested on model examples and on practical data, and the results have been confirmed by drilling.
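The conversion of a truncated power series into a C-fraction can be sketched by repeated reciprocal-and-shift steps, an equivalent formulation of Viskovatov's algorithm (the paper gives no code; this is a minimal illustrative sketch with no breakdown handling beyond stopping at a near-zero leading coefficient):

```python
def _series_recip(c, n):
    """First n coefficients of 1/sum(c[i] x^i); requires c[0] != 0."""
    r = [1.0 / c[0]]
    for k in range(1, n):
        s = sum(c[j] * r[k - j] for j in range(1, min(k, len(c) - 1) + 1))
        r.append(-s / c[0])
    return r

def c_fraction(c):
    """Coefficients a_k with f(x) ≈ a0/(1 + a1 x/(1 + a2 x/(1 + ...))).

    Each step writes s(x) = s[0] / (1 + x t(x)) and recurses on t:
    since 1 + x t(x) = s[0]/s(x), t is the shifted reciprocal series.
    """
    a, s = [], list(map(float, c))
    while s and abs(s[0]) > 1e-12:
        a.append(s[0])
        u = _series_recip(s, len(s))                       # s[0]/s, scaled below
        s = [s[0] * u[j + 1] for j in range(len(s) - 1)]   # (s[0]/s - 1)/x
    return a

def eval_c_fraction(a, x):
    """Evaluate a0/(1 + a1 x/(1 + a2 x/(1 + ...))) bottom-up."""
    v = a[-1]
    for ak in reversed(a[:-1]):
        v = ak / (1.0 + x * v)
    return v

# Truncated geometric series for 1/(1-x): its exact C-fraction is 1/(1 - x).
a = c_fraction([1.0] * 6)
print(a)                        # -> [1.0, -1.0]
print(eval_c_fraction(a, 0.5))  # -> 2.0
```

The example shows the attraction of the rational form: the series 1 + x + x² + … diverges for |x| ≥ 1, yet its C-fraction reproduces the pole of 1/(1 - x) exactly, which is the behavior the authors exploit for fields with poles in the lower half-space.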
Wide-Field InfraRed Survey Telescope (WFIRST) Slitless Spectrometer: Design, Prototype, and Results
NASA Technical Reports Server (NTRS)
Gong, Qian; Content, David; Dominguez, Margaret; Emmett, Thomas; Griesmann, Ulf; Hagopian, John; Kruk, Jeffrey; Marx, Catherine; Pasquale, Bert; Wallace, Thomas;
2016-01-01
The slitless spectrometer plays an important role in the Wide-Field InfraRed Survey Telescope (WFIRST) mission for the survey of emission-line galaxies, which will be an unprecedented, very wide field, HST-quality 3D survey of emission-line galaxies. The concept of the compound grism as a slitless spectrometer has been presented previously. This paper briefly discusses the challenges and solutions of the optical design and recent specification updates, and briefly compares the prototype with the latest design. The emphasis, however, is on the progress of the grism prototype: the fabrication and testing of the complicated diffractive optical elements and the powered prism, as well as grism assembly alignment and testing, in particular the use of different tools and methods, such as IR phase-shifting and wavelength-shifting interferometry, to complete the element and assembly tests. The paper also presents very encouraging results from recent element and assembly tests. Finally, we briefly touch on the plan for testing spectral characteristics such as spectral resolution and response.
Gulf Coast Disaster Management: Forest Damage Detection and Carbon Flux Estimation
NASA Astrophysics Data System (ADS)
Maki, A. E.; Childs, L. M.; Jones, J.; Matthews, C.; Spindel, D.; Batina, M.; Malik, S.; Allain, M.; Brooks, A. O.; Brozen, M.; Chappell, C.; Frey, J. W.
2008-12-01
Along the Gulf Coast and Eastern Seaboard, tropical storms and hurricanes annually cause defoliation and deforestation amongst coastal forests. After a severe storm clears, there is an urgent need to assess impacts on timber resources for targeting state and national resources to assist in recovery. It is important to identify damaged areas following the storm, due to their increased probability of fire risk, as well as the effect upon the carbon budget. Better understanding and management of the immediate and future effects on the carbon cycle in the coastal forest ecosystem is especially important. Current methods of detection involve assessment through ground-based field surveys, aerial surveys, computer modeling of meteorological data, space-borne remote sensing, and Forest Inventory and Analysis field plots. Introducing remotely-sensed data from NASA and NASA-partnered Earth Observation Systems (EOS), this project seeks to improve the current methodology and focuses on a need for methods that are more synoptic than field surveys and more closely linked to the phenomenology of tree loss and damage than passive remote sensing methods. The primary concentration is on the utilization of Ice, Cloud, and land Elevation Satellite (ICESat) Geoscience Laser Altimeter System (GLAS) data products to detect changes in forest canopy height as an indicator of post-hurricane forest disturbances. By analyzing ICESat data over areas affected by Hurricane Katrina, this study shows that ICESat is a useful method of detecting canopy height change, though further research is needed in mixed forest areas. Other EOS utilized in this study include Landsat, Moderate Resolution Imaging Spectroradiometer (MODIS), and the NASA verified and validated international Advanced Wide Field Sensor (AWiFS) sensor.
This study addresses how NASA could apply ICESat data to contribute to an improved method of detecting hurricane-caused forest damage in coastal areas; thus to pinpoint areas more susceptible to fire damage and subsequent loss of carbon sequestration.
Profile of Pre-Service Science Teachers Based on STEM Career Interest Survey
NASA Astrophysics Data System (ADS)
Winarno, N.; Widodo, A.; Rusdiana, D.; Rochintaniawati, D.; Afifah, R. M. A.
2017-09-01
This study aims to investigate the profile of pre-service science teachers based on a STEM (Science, Technology, Engineering, and Mathematics) Career Interest Survey. The study uses a descriptive survey method as the research design. Samples were collected from 66 pre-service science teachers at a university located in Bandung, Indonesia. The results show that the average career-interest score is 4.08 in the field of technology, 3.80 in science, 3.39 in mathematics, and 3.30 in engineering; pre-service science teachers are thus found to have interest in STEM career fields. This research is needed because many people choose majors or studies that do not accord with their interests and talents. The recommendation of this study is to develop learning for pre-service science teachers using a STEM approach.
ERIC Educational Resources Information Center
Ely, Mindy S.; Ostrosky, Michaelene M.
2017-01-01
Introduction: Professionals working with infants and toddlers with visual impairments (that is, those who are blind or have low vision) were surveyed regarding their preservice training and their awareness and use of 29 resources related to young children who are visually impaired. Methods: Early intervention visual impairment professionals (n =…
Research on Mail Surveys: Response Rates and Methods in Relation to Population Group and Time.
ERIC Educational Resources Information Center
Boser, Judith A.; Green, Kathy
The purpose of this review was to look for trends across time in response rates and variables studied for published mail surveys and to compare response rates and variables studied for different target populations. Studies were identified in databases in four fields: education, psychology, business and marketing, and sociology. A total of 225…
Software for improved field surveys of nesting marine turtles.
Anastácio, R; Gonzalez, J M; Slater, K; Pereira, M J
2017-09-07
Field data are still recorded on paper in many worldwide beach surveys of nesting marine turtles. The data must subsequently be transferred into an electronic database, and this can introduce errors into the dataset. To minimize such errors, the "Turtles" software was developed and piloted to record field data by one software user accompanying one Tortuguero on the Akumal beaches, Quintana Roo, Mexico, during night patrols from June 1st to July 31st. Data exported from the software were compared with the paper forms entered into a database (henceforth, the traditional method). Preliminary assessment indicated that the software user tended to record a greater number of metrics (an average of 18.3 fields ± 5.4 sd vs. 8.6 fields ± 2.1 sd recorded by the traditional method). The traditional method introduced three types of "errors" into the dataset: missing values in relevant fields (40.1%), different answers for the same value (9.8%), and inconsistent data (0.9%). Only 5.8% of these (missing values) were found with the software methodology. Although tested by only a single user, the results suggest increased efficacy, and further examination is warranted to accurately assess the merit of replacing traditional methods of data recording for beach monitoring programmes.
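Two of the error types tallied in the comparison (missing values and inconsistent data) can be checked automatically once records are digital. A minimal sketch of such validation; the field names (`eggs_counted`, `clutch_size`, `species`) and the consistency rule are hypothetical, invented for illustration:

```python
def audit_records(records, required_fields):
    """Count missing values and internally inconsistent entries in a list
    of dict-shaped survey records (illustrative rules only)."""
    missing = 0
    inconsistent = 0
    for rec in records:
        # Missing-value check: required field absent or empty.
        missing += sum(1 for f in required_fields if rec.get(f) in (None, ""))
        # Hypothetical consistency rule: eggs counted cannot exceed clutch size.
        eggs, clutch = rec.get("eggs_counted"), rec.get("clutch_size")
        if eggs is not None and clutch is not None and eggs > clutch:
            inconsistent += 1
    return {"missing": missing, "inconsistent": inconsistent}

records = [
    {"species": "", "eggs_counted": 5, "clutch_size": 3},       # 1 missing, 1 inconsistent
    {"species": "green", "eggs_counted": 2, "clutch_size": 4},  # clean
]
print(audit_records(records, ["species", "eggs_counted", "clutch_size"]))
```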
2015-01-01
The mission of the Water Resources Discipline of the U.S. Geological Survey (USGS) is to provide the information and understanding needed for wise management of the Nation's water resources. Inherent in this mission is the responsibility to collect data that accurately describe the physical, chemical, and biological attributes of water systems. These data are used for environmental and resource assessments by the USGS, other government agencies and scientific organizations, and the general public. Reliable and quality-assured data are essential to the credibility and impartiality of the water-resources appraisals carried out by the USGS. The development and use of a National Field Manual is necessary to achieve consistency in the scientific methods and procedures used, to document those methods and procedures, and to maintain technical expertise. USGS field personnel use this manual to ensure that the data collected are of the quality required to fulfill our mission.
Selecting AGN through Variability in SN Datasets
NASA Astrophysics Data System (ADS)
Boutsia, K.; Leibundgut, B.; Trevese, D.; Vagnetti, F.
2010-07-01
Variability is a main property of Active Galactic Nuclei (AGN), and it was adopted as a selection criterion using multi-epoch surveys conducted for the detection of supernovae (SNe). We have used two SN datasets. First we selected the AXAF field of the STRESS project, centered on the Chandra Deep Field South, where various optical catalogs exist in addition to the deep X-ray surveys. Our method yielded 132 variable AGN candidates. We then extended our method to include the dataset of the ESSENCE project, which has been active for 6 years, producing high-quality light curves in the R and I bands. We obtained a sample of ˜4800 variable sources, down to R=22, in the whole 12 deg^2 ESSENCE field. Among them, a subsample of ˜500 high-priority AGN candidates was created using the shape of the structure function as a secondary criterion. In a pilot spectroscopic run we have confirmed the AGN nature of nearly all of our candidates.
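The structure function used as a secondary criterion is, in its first-order form, the mean magnitude difference between observation pairs as a function of their time lag. A rough sketch; the binning scheme is an assumption, and the paper's exact estimator is not specified in the abstract:

```python
from collections import defaultdict

def structure_function(times, mags, bin_width=10.0):
    """First-order structure function: mean |delta mag| over all observation
    pairs, binned by time lag (bin_width in the same units as times)."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            lag = abs(times[j] - times[i])
            b = int(lag // bin_width)
            sums[b] += abs(mags[j] - mags[i])
            counts[b] += 1
    # Key each bin by its lower lag edge.
    return {b * bin_width: sums[b] / counts[b] for b in sorted(sums)}

# Toy light curve: 3 epochs -> two 10-day lags and one 20-day lag.
print(structure_function([0.0, 10.0, 20.0], [1.0, 2.0, 4.0]))
```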
NASA Astrophysics Data System (ADS)
Scott, K. S.; Yun, M. S.; Wilson, G. W.; Austermann, J. E.; Aguilar, E.; Aretxaga, I.; Ezawa, H.; Ferrusca, D.; Hatsukade, B.; Hughes, D. H.; Iono, D.; Giavalisco, M.; Kawabe, R.; Kohno, K.; Mauskopf, P. D.; Oshima, T.; Perera, T. A.; Rand, J.; Tamura, Y.; Tosaki, T.; Velazquez, M.; Williams, C. C.; Zeballos, M.
2010-07-01
We present the first results from a confusion-limited map of the Great Observatories Origins Deep Survey-South (GOODS-S) taken with the AzTEC camera on the Atacama Submillimeter Telescope Experiment. We imaged a field to a 1σ depth of 0.48-0.73 mJy beam^-1, making this one of the deepest blank-field surveys at mm wavelengths ever achieved. Although by traditional standards our GOODS-S map is extremely confused due to a sea of faint underlying sources, we demonstrate through simulations that our source identification and number counts analyses are robust, and the techniques discussed in this paper are relevant for other deeply confused surveys. We find a total of 41 dusty starburst galaxies with signal-to-noise ratios S/N >= 3.5 within this uniformly covered region, where only two are expected to be false detections, and an additional seven robust source candidates located in the noisier (1σ ~ 1 mJy beam^-1) outer region of the map. We derive the 1.1 mm number counts from this field using two different methods, a fluctuation or ``P(d)'' analysis and a semi-Bayesian technique, and find that both give consistent results. Our data are well fit by a Schechter function model. Given the depth of this survey, we put the first tight constraints on the 1.1 mm number counts at S_1.1mm = 0.5 mJy, and we find evidence that the faint-end number counts from various SCUBA surveys towards lensing clusters are biased high. In contrast to the 870 μm survey of this field with the LABOCA camera, we find no apparent underdensity of sources compared to previous surveys at 1.1 mm; the estimates of the number counts of SMGs at flux densities >1 mJy determined here are consistent with those measured from the AzTEC/SHADES survey. Additionally, we find a significant number of SMGs not identified in the LABOCA catalogue.
We find that in contrast to observations at λ <= 500μm, MIPS 24μm sources do not resolve the total energy density in the cosmic infrared background at 1.1 mm, demonstrating that a population of z >~ 3 dust-obscured galaxies that are unaccounted for at these shorter wavelengths potentially contributes a large fraction (~2/3) of the infrared background at 1.1 mm.
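A Schechter-form fit to differential number counts has the general shape dN/dS ∝ (S/S')^(-α) exp(-S/S'); the abstract does not give the fitted parameter values, so the sketch below shows the functional form only, with all parameter values left to the caller:

```python
import math

def schechter_counts(s, n_prime, s_prime, alpha):
    """Schechter-form differential number counts dN/dS at flux density s.

    n_prime: normalization; s_prime: characteristic flux density;
    alpha: faint-end power-law slope. All values are caller-supplied;
    the survey's fitted values are not reproduced here.
    """
    return n_prime * (s / s_prime) ** (-alpha) * math.exp(-s / s_prime)

# At s = s_prime the power-law factor is 1 and only exp(-1) remains.
print(schechter_counts(1.0, 100.0, 1.0, 2.0))  # 100 * e^-1 ~ 36.8
```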
State of the art survey on MRI brain tumor segmentation.
Gordillo, Nelly; Montseny, Eduard; Sobrevilla, Pilar
2013-10-01
Brain tumor segmentation consists of separating the different tumor tissues (solid or active tumor, edema, and necrosis) from normal brain tissues: gray matter (GM), white matter (WM), and cerebrospinal fluid (CSF). In brain tumor studies, the existence of abnormal tissues may be easily detectable most of the time. However, accurate and reproducible segmentation and characterization of abnormalities are not straightforward. In the past, many researchers in the fields of medical imaging and soft computing have made significant contributions to brain tumor segmentation. Both semiautomatic and fully automatic methods have been proposed. Clinical acceptance of segmentation techniques has depended on the simplicity of the segmentation and the degree of user supervision. Interactive or semiautomatic methods are likely to remain dominant in practice for some time, especially in applications where erroneous interpretations are unacceptable. This article presents an overview of the most relevant brain tumor segmentation methods conducted after the acquisition of the image. Given the advantages of magnetic resonance imaging over other diagnostic imaging, this survey is focused on MRI brain tumor segmentation. Semiautomatic and fully automatic techniques are emphasized.
Geomagnetic field observations in the Kopaonik thrust region, Yugoslavia.
NASA Astrophysics Data System (ADS)
Bicskei, T.; Popeskov, M.
1991-09-01
In the absence of continuous registrations of the geomagnetic field variations in the surveyed region, the nearest permanent observatory records had to be used in the data reduction procedure. The proposed method estimates the differences between the hourly mean values at the particular measuring site, which are not actually known, and at the observatory on the basis of a series of instantaneous total field intensity values measured simultaneously at these two places. The application of this method to the geomagnetic field data from the wider area of the Kopaonik thrust region has revealed local field changes which show connection with pronounced seismic activity that has been going on in this region since it was affected by the M = 6.0 earthquake on May 18, 1980.
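The reduction described above, estimating site-observatory differences from simultaneous instantaneous readings and applying them to the observatory hourly means, can be sketched in simplified form. A constant offset is assumed here for brevity; the actual method estimates differences that vary over the series:

```python
def reduce_field_values(site_spots, obs_spots, obs_hourly_means):
    """Reduce observatory hourly means to a field site.

    site_spots / obs_spots: total field intensity values (nT) measured
    simultaneously at the site and the observatory. The mean difference
    is taken as the site-observatory offset (a simplification) and
    applied to the observatory hourly means.
    """
    diffs = [s - o for s, o in zip(site_spots, obs_spots)]
    offset = sum(diffs) / len(diffs)
    return [m + offset for m in obs_hourly_means]

# Toy values: the site reads ~2 nT above the observatory.
print(reduce_field_values([47010.0, 47012.0], [47008.0, 47010.0], [47009.0, 47011.0]))
```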
A new method to search for high-redshift clusters using photometric redshifts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castignani, G.; Celotti, A.; Chiaberge, M.
2014-09-10
We describe a new method (Poisson probability method, PPM) to search for high-redshift galaxy clusters and groups by using photometric redshift information and galaxy number counts. The method relies on Poisson statistics and is primarily introduced to search for megaparsec-scale environments around a specific beacon. The PPM is tailored to both the properties of the FR I radio galaxies in the Chiaberge et al. sample, which are selected within the COSMOS survey, and to the specific data set used. We test the efficiency of our method of searching for cluster candidates against simulations. Two different approaches are adopted. (1) We use two z ∼ 1 X-ray detected cluster candidates found in the COSMOS survey and we shift them to higher redshift up to z = 2. We find that the PPM detects the cluster candidates up to z = 1.5, and it correctly estimates both the redshift and size of the two clusters. (2) We simulate spherically symmetric clusters of different size and richness, and we locate them at different redshifts (i.e., z = 1.0, 1.5, and 2.0) in the COSMOS field. We find that the PPM detects the simulated clusters within the considered redshift range with a statistical 1σ redshift accuracy of ∼0.05. The PPM is an efficient alternative method for high-redshift cluster searches that may also be applied to both present and future wide field surveys such as SDSS Stripe 82, LSST, and Euclid. Accurate photometric redshifts and a survey depth similar or better than that of COSMOS (e.g., I < 25) are required.
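The core of a Poisson-statistics overdensity search is the tail probability of the background count model: if the chance of seeing at least the observed number of galaxies under the background rate is small, the region is flagged as a candidate. A minimal sketch of that ingredient only, not the full PPM, which also scans position, scale, and photometric-redshift bins:

```python
import math

def poisson_tail(k_obs, lam):
    """P(N >= k_obs) for a Poisson background with mean lam.

    Small values indicate that the observed galaxy count is unlikely
    under the background alone, i.e., a candidate overdensity.
    """
    # P(N >= k) = 1 - sum_{n < k} e^{-lam} lam^n / n!
    cdf = sum(math.exp(-lam) * lam ** n / math.factorial(n) for n in range(k_obs))
    return 1.0 - cdf

# 12 galaxies where the background predicts 3 is a strong overdensity signal.
print(poisson_tail(12, 3.0))  # small probability -> flag as candidate
```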
Geological Survey research 1981
,
1982-01-01
This U.S. Geological Survey activities report includes a summary of 1981 fiscal year scientific and economic results accompanied by a list of geologic, hydrologic, and cartographic investigations in progress. The summary of results includes: (1) Mineral, (2) Water resources, (3) Engineering geology and hydrology, (4) Regional geology, (5) Principles and processes, (6) Laboratory and field methods, (7) Topographic surveys and mapping, (8) Management of resources on public lands, (9) Land information and analysis, and (10) Investigations in other countries. Also included are lists of investigations in progress.
Eye Tracking and Head Movement Detection: A State-of-Art Survey
2013-01-01
Eye-gaze detection and tracking have been an active research field in recent years, as they add convenience to a variety of applications. They are considered a significant unconventional method of human-computer interaction. Head movement detection has also received researchers' attention and interest, as it has been found to be a simple and effective interaction method. Both technologies are considered the easiest alternative interface methods. They serve a wide range of severely disabled people who are left with minimal motor abilities. For both eye tracking and head movement detection, several different approaches have been proposed and used to implement different algorithms for these technologies. Despite the amount of research done on both technologies, researchers are still trying to find robust methods to use effectively in various applications. This paper presents a state-of-the-art survey of eye tracking and head movement detection methods proposed in the literature. Examples of different fields of application for both technologies, such as human-computer interaction, driving assistance systems, and assistive technologies, are also investigated. PMID:27170851
VizieR Online Data Catalog: Improved multi-band photometry from SERVS (Nyland+, 2017)
NASA Astrophysics Data System (ADS)
Nyland, K.; Lacy, M.; Sajina, A.; Pforr, J.; Farrah, D.; Wilson, G.; Surace, J.; Haussler, B.; Vaccari, M.; Jarvis, M.
2017-07-01
The Spitzer Extragalactic Representative Volume Survey (SERVS) sky footprint includes five well-studied astronomical deep fields with abundant multi-wavelength data spanning an area of ~18 deg^2 and a co-moving volume of ~0.8 Gpc^3. The five deep fields included in SERVS are the XMM-LSS field, Lockman Hole (LH), ELAIS-N1 (EN1), ELAIS-S1 (ES1), and Chandra Deep Field South (CDFS). SERVS provides NIR, post-cryogenic imaging in the 3.6 and 4.5 um Spitzer/IRAC bands to a depth of ~2 uJy. IRAC dual-band source catalogs generated using traditional catalog extraction methods are described in Mauduit+ (2012PASP..124..714M). The Spitzer IRAC data are complemented by ground-based NIR observations from the VISTA Deep Extragalactic Observations (VIDEO; Jarvis+ 2013MNRAS.428.1281J) survey in the south in the Z, Y, J, H, and Ks bands and UKIRT Infrared Deep Sky Survey (UKIDSS; Lawrence+ 2007, see II/319) in the north in the J and K bands. SERVS also provides substantial overlap with infrared data from SWIRE (Lonsdale+ 2003PASP..115..897L) and the Herschel Multitiered Extragalactic Survey (HerMES; Oliver+ 2012, VIII/95). As shown in Figure 1, one square degree of the XMM-LSS field overlaps with ground-based optical data from the Canada-France-Hawaii Telescope Legacy Survey Deep field 1 (CFHTLS-D1). The CFHTLS-D1 region is centered at RAJ2000=02:25:59, DEJ2000=-04:29:40 and includes imaging through the filter set u', g', r', i', and z'. Thus, in combination with the NIR data from SERVS and VIDEO that overlap with the CFHTLS-D1 region, multi-band imaging over a total of 12 bands is available. (2 data files).
Investigation of Axial Electric Field Measurements with Grounded-Wire TEM Surveys
NASA Astrophysics Data System (ADS)
Zhou, Nan-nan; Xue, Guo-qiang; Li, Hai; Hou, Dong-yang
2018-01-01
Grounded-wire transient electromagnetic (TEM) surveying is often performed along the equatorial direction, with observation lines parallel to the transmitting wire at a certain transmitter-receiver distance. However, this method takes into account only the equatorial component of the electromagnetic field, and little effort has been made to incorporate the other major component, directed along the transmitting wire and here denoted the axial field. To obtain a comprehensive understanding of its fundamental characteristics and to guide the design of a corresponding observation system for reliable anomaly detection, this study for the first time investigates the axial electric field from three crucial aspects, including its decay curve, plane distribution, and anomaly sensitivity, through both synthetic modeling and a real application to a major coal field in China. The results demonstrate a higher sensitivity of the axial electric field to both high- and low-resistivity anomalies and confirm its great potential for robust anomaly detection in the subsurface.
Quantifying post-fire ponderosa pine snags using GIS techniques on scanned aerial photographs
NASA Astrophysics Data System (ADS)
Kent, Kevin
Snags are an important component of forest ecosystems because of their utility in forest-nutrient cycling and provision of critical wildlife habitat, as well as associated fuel management concerns relating to coarse woody debris (CWD). Knowledge of snag and CWD trajectories is needed for land managers to plan for long-term ecosystem change in post-fire regimes. This need will likely be exacerbated by the increasingly warm and dry climatic conditions projected for the U.S. Southwest. One of the best prospects for studying fire-induced landscape change beyond the plot scale, but still at a resolution sufficient to resolve individual snags, is to utilize the available aerial photography record. Previous field-based studies of snag and CWD loads in the Southwest have relied on regional chronosequences to judge the recovery dynamic of ponderosa pine (Pinus ponderosa) burns. This previous research has been spatially and temporally restricted because of field survey extent limitations and uncertainty associated with the chronosequence approach (i.e., space-for-time substitution), which does not consider differences between specific site conditions and histories. This study develops highly automated methods for remotely quantifying and characterizing the spatial and temporal distribution of large snags associated with severe forest fires from very high resolution (VHR) landscape imagery I obtained from scans of aerial photos. Associated algorithms utilize the sharp edges, shape, shadow, and contrast characteristics of snags to enable feature recognition. Additionally, using snag shadow length, image acquisition time, and location information, heights were estimated for each identified snag. Furthermore, a novel solution was developed for extracting individual snags from areas of high snag density by overlaying parallel lines in the direction of the snag shadows and extracting local maxima lines contained by each snag polygon.
Field survey data coincident with the imagery coverage for post-fire ponderosa pine forests allowed calibration and accuracy assessment of these new tools. These new methods may allow for broader estimation of snag dynamics in post-fire landscapes while significantly lowering the human and material costs of conducting such surveys. Outcomes for these methods were mixed. Both the snag count and snag height values were chronically underestimated using a feature extraction method and an edge detection method, so an adjustment constant was developed for the categories of each method. Average accuracies ranged from 46 to 54% below the field-based values for the count attribute and 2-12% above the ground values for the snag height attribute using these two approaches. These methods show much promise and are less resource intensive than field surveys, but more research is needed to improve overall accuracy.
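The shadow-based height estimation described above follows from simple trigonometry: height = shadow length × tan(solar elevation), where the solar elevation is derived from the image acquisition time and location. A sketch assuming flat terrain and a vertical snag:

```python
import math

def snag_height(shadow_length_m, sun_elevation_deg):
    """Estimate snag height (m) from its shadow length and the solar
    elevation angle at image acquisition time.

    Assumes flat terrain and a vertical snag; slope corrections would
    be needed in rugged sites.
    """
    return shadow_length_m * math.tan(math.radians(sun_elevation_deg))

# A 10 m shadow with the sun at 45 degrees elevation implies a ~10 m snag.
print(snag_height(10.0, 45.0))
```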
How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?
West, Brady T; Sakshaug, Joseph W; Aurelien, Guy Alain S
2016-01-01
Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data.
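One of the simplest analytic errors the article describes, ignoring survey weights when estimating a population quantity, can be illustrated directly. The snippet below is a textbook sketch with invented numbers, not the article's SESTAT analysis:

```python
def weighted_mean(values, weights):
    """Design-weighted mean: each respondent's value is scaled by the
    number of population units the respondent represents."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Hypothetical two-respondent example: the second respondent represents
# three times as many population units as the first.
incomes = [30.0, 90.0]
weights = [100.0, 300.0]

naive = sum(incomes) / len(incomes)        # ignores the design: 60.0
correct = weighted_mean(incomes, weights)  # accounts for weights: 75.0
print(naive, correct)
```

Standard errors require more than weights alone (strata and clusters must also enter the variance estimation), which is why design-aware software is needed for full inference.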
Simulation of scenario earthquake influenced field by using GIS
Zuo, H.-Q.; Xie, L.-L.; Borcherdt, R.D.
1999-01-01
The method for estimating the site effect on ground motion specified by Borcherdt (1994a, 1994b) is briefly introduced in this paper. This method, together with detailed geological and site classification data for the San Francisco Bay area of California, United States, is applied to simulate the influenced field of a scenario earthquake using GIS technology, and software for the simulation has been developed. The paper is a partial result of a cooperative research project between the China Seismological Bureau and the US Geological Survey.
Development of the EM tomography system by the vertical electromagnetic profiling (VEMP) method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miura, Y.; Osato, K.; Takasugi, S.
1995-12-31
As a part of the "Deep-Seated Geothermal Resources Survey" project being undertaken by the NEDO, the Vertical ElectroMagnetic Profiling (VEMP) method is being developed to accurately obtain deep resistivity structure. The VEMP method acquires multi-frequency three-component magnetic field data in an open hole well using controlled sources (loop sources or grounded-wire sources) emitted at the surface. Numerical simulation using EM3D demonstrated that phase data of the VEMP method is very sensitive to resistivity structure and the phase data will also indicate presence of deep anomalies. Forward modelling was also used to determine required transmitter moments for various grounded-wire and loop sources for a field test using the WD-1 well in the Kakkonda geothermal area. Field logging of the well was carried out in May 1994 and the processed field data matches well the simulated data.
Methods for implementing a medicine outlet survey: lessons from the anti-malarial market.
O'Connell, Kathryn A; Poyer, Stephen; Solomon, Tsione; Munroe, Erik; Patouillard, Edith; Njogu, Julius; Evance, Illah; Hanson, Kara; Shewchuk, Tanya; Goodman, Catherine
2013-02-05
In recent years an increasing number of public investments and policy changes have been made to improve the availability, affordability and quality of medicines available to consumers in developing countries, including anti-malarials. It is important to monitor the extent to which these interventions are successful in achieving their aims using quantitative data on the supply side of the market. There are a number of challenges related to studying supply, including outlet sampling, gaining provider cooperation and collecting accurate data on medicines. This paper provides guidance on key steps to address these issues when conducting a medicine outlet survey in a developing country context. While the basic principles of good survey design and implementation are important for all surveys, there are a set of specific issues that should be considered when conducting a medicine outlet survey. This paper draws on the authors' experience of designing and implementing outlet surveys, including the lessons learnt from ACTwatch outlet surveys on anti-malarial retail supply, and other key studies in the field. Key lessons and points of debate are distilled around the following areas: selecting a sample of outlets; techniques for collecting and analysing data on medicine availability, price and sales volumes; and methods for ensuring high quality data in general. The authors first consider the inclusion criteria for outlets, contrasting comprehensive versus more focused approaches. Methods for developing a reliable sampling frame of outlets are then presented, including use of existing lists, key informants and an outlet census. Specific issues in the collection of data on medicine prices and sales volumes are discussed; and approaches for generating comparable price and sales volume data across products using the adult equivalent treatment dose (AETD) are explored. 
The paper concludes with advice on practical considerations, including questionnaire design, field worker training, and data collection. Survey materials developed by ACTwatch for investigating anti-malarial markets in sub-Saharan Africa and Asia provide a helpful resource for future studies in this area.
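The adult equivalent treatment dose (AETD) standardization mentioned above makes prices comparable across products by converting a pack price into the cost of a full adult course of active ingredient. A sketch with hypothetical pack parameters; actual AETD values depend on the drug and treatment guidelines:

```python
def price_per_aetd(pack_price, mg_per_unit, units_per_pack, aetd_mg):
    """Price of one adult equivalent treatment dose (AETD).

    pack_price: retail price of one pack;
    mg_per_unit * units_per_pack: total active ingredient per pack;
    aetd_mg: milligrams of active ingredient in a full adult course.
    """
    mg_per_pack = mg_per_unit * units_per_pack
    packs_needed = aetd_mg / mg_per_pack
    return pack_price * packs_needed

# Hypothetical: a $2.00 pack of 6 x 100 mg tablets, with a 600 mg adult
# course, costs exactly $2.00 per AETD.
print(price_per_aetd(2.00, 100.0, 6, 600.0))
```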
Imaizumi, Yoshitaka; Suzuki, Noriyuki; Shiraishi, Fujio; Nakajima, Daisuke; Serizawa, Shigeko; Sakurai, Takeo; Shiraishi, Hiroaki
2018-01-24
In pesticide risk management in Japan, predicted environmental concentrations are estimated by a tiered approach, and the Ministry of the Environment also performs field surveys to confirm the maximum concentrations of pesticides with risk concerns. To contribute to more efficient and effective field surveys, we developed the Pesticide Chemicals High Resolution Estimation Method (PeCHREM) for estimating spatially and temporally variable emissions of various paddy herbicides from paddy fields to the environment. We used PeCHREM and the G-CIEMS multimedia environmental fate model to predict day-to-day environmental concentration changes of 25 herbicides throughout Japan. To validate the PeCHREM/G-CIEMS model, we also conducted a field survey, in which river waters were sampled at least once every two weeks at seven sites in six prefectures from April to July 2009. In 20 of 139 sampling site-herbicide combinations in which herbicides were detected in at least three samples, all observed concentrations differed from the corresponding prediction by less than one order of magnitude. We also compared peak concentrations and the dates on which the concentrations reached peak values (peak dates) between predictions and observations. The peak concentration differences between predictions and observations were less than one order of magnitude in 66% of the 166 sampling site-herbicide combinations in which herbicide was detected in river water. The observed and predicted peak dates differed by less than two weeks in 79% of these 166 combinations. These results confirm that the PeCHREM/G-CIEMS model can improve the efficiency and effectiveness of surveys by predicting the peak concentrations and peak dates of various herbicides.
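The validation criterion used throughout, predicted and observed concentrations differing by less than one order of magnitude, is a check on the log-ratio of the two values. A minimal sketch:

```python
import math

def within_one_order(predicted, observed):
    """True if predicted and observed concentrations differ by less than
    one order of magnitude, i.e. |log10(predicted/observed)| < 1."""
    return abs(math.log10(predicted / observed)) < 1.0

# A 5x over-prediction passes; a 20x over-prediction fails.
print(within_one_order(5.0, 1.0), within_one_order(20.0, 1.0))
```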
NASA Astrophysics Data System (ADS)
Price-Whelan, Adrian M.; Agüeros, Marcel A.; Fournier, Amanda P.; Street, Rachel; Ofek, Eran O.; Covey, Kevin R.; Levitan, David; Laher, Russ R.; Sesar, Branimir; Surace, Jason
2014-01-01
Many photometric time-domain surveys are driven by specific goals, such as searches for supernovae or transiting exoplanets, which set the cadence with which fields are re-imaged. In the case of the Palomar Transient Factory (PTF), several sub-surveys are conducted in parallel, leading to non-uniform sampling over its ~20,000 deg^2 footprint. While the median 7.26 deg^2 PTF field has been imaged ~40 times in the R band, ~2300 deg^2 have been observed >100 times. We use PTF data to study the trade-off between searching for microlensing events in a survey whose footprint is much larger than that of typical microlensing searches, but with far-from-optimal time sampling. To examine the probability that microlensing events can be recovered in these data, we test statistics used on uniformly sampled data to identify variables and transients. We find that the von Neumann ratio performs best for identifying simulated microlensing events in our data. We develop a selection method using this statistic and apply it to data from fields with >10 R-band observations, 1.1 × 10^9 light curves, uncovering three candidate microlensing events. We lack simultaneous, multi-color photometry to confirm these as microlensing events. However, their number is consistent with predictions for the event rate in the PTF footprint over the survey's three years of operations, as estimated from near-field microlensing models. This work can help constrain all-sky event rate predictions and tests microlensing signal recovery in large data sets, which will be useful to future time-domain surveys, such as that planned with the Large Synoptic Survey Telescope.
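The von Neumann ratio used for candidate selection is the mean squared successive difference of a light curve divided by its variance; smooth, correlated variability such as a microlensing bump drives it well below the white-noise expectation of 2. A sketch of the plain statistic; the survey's exact normalization may differ:

```python
import statistics

def von_neumann_ratio(x):
    """von Neumann ratio of a time-ordered series: mean squared successive
    difference over the sample variance. Uncorrelated noise gives ~2;
    smooth trends (e.g. a microlensing bump) give values well below 2."""
    n = len(x)
    mean_sq_diff = sum((x[i + 1] - x[i]) ** 2 for i in range(n - 1)) / (n - 1)
    return mean_sq_diff / statistics.variance(x)

# A monotonic ramp is maximally correlated and scores far below 2.
print(von_neumann_ratio([1.0, 2.0, 3.0, 4.0, 5.0]))  # 0.4
```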
Geological Survey research 1976
,
1976-01-01
This U.S. Geological Survey activities report includes a summary of recent (1976 fiscal year) scientific and economic results accompanied by a list of geologic and hydrologic investigations in progress and a report on the status of topographic mapping. The summary of results includes: (1) Mineral resources, Water resources, (2) Engineering geology and hydrology, (3) Regional geology, (4) Principles and processes, (5) Laboratory and field methods, (6) Topographic surveys and mapping, (7) Management of resources on public lands, (8) Land information and analysis, and (9) Investigations in other countries. Also included are lists of cooperating agencies and Geological Survey offices. (Woodard-USGS)
Geological Survey research 1978
,
1978-01-01
This U.S. Geological Survey activities report includes a summary of 1978 fiscal year scientific and economic results accompanied by a list of geologic and hydrologic investigations in progress and a report on the status of topographic mapping. The summary of results includes: (1) Mineral and water resources, (2) Engineering geology and hydrology, (3) Regional geology, (4) Principles and processes, (5) Laboratory and field methods, (6) Topographic surveys and mapping, (7) Management of resources on public lands, (8) Land information and analysis, and (9) Investigations in other countries. Also included are lists of cooperating agencies and Geological Survey offices. (Woodard-USGS)
A Safari Through Density Functional Theory
NASA Astrophysics Data System (ADS)
Dreizler, Reiner M.; Lüdde, Cora S.
Density functional theory is widely used to treat quantum many-body problems in many areas of physics and related fields. A brief survey of this method, covering foundations, functionals, and applications, is presented here.
A field survey on coffee beans drying methods of Indonesian small holder farmers
NASA Astrophysics Data System (ADS)
Siagian, Parulian; Setyawan, Eko Y.; Gultom, Tumiur; Napitupulu, Farel H.; Ambarita, Himsar
2017-09-01
Drying agricultural products is a post-harvest process that consumes significant energy and affects product quality. This paper presents a literature review and a field survey of the coffee-bean drying methods used by Indonesian farmers. The objective is to supply the information needed to develop a continuous solar drier. The literature shows that the intermittent character of sun drying yields better-quality coffee beans than constant convective drying. To use energy efficiently, the drying process should be divided into several stages: in the first stage, when the moisture content is high, a higher drying-air temperature is more effective; afterwards, when the moisture content is low, a lower drying-air temperature is better. The field survey of coffee-bean drying in Sumatera Utara province reveals that the drying processes in use are very traditional and can be divided into two modes, depending on the coffee-bean type. Arabica coffee is first fermented and sun-dried to a moisture content of 80%, then dried in a greenhouse-type drier to a moisture content of about 12%; the latter step typically takes 3 days. Robusta coffee, on the other hand, is dried by direct exposure to the sun without any treatment; after drying, the beans are peeled. These findings can inform the development of a continuous solar drier suitable for coffee beans.
A Survey on Topics, Researchers and Cultures in the Field of Digital Heritage
NASA Astrophysics Data System (ADS)
Münster, S.
2017-08-01
Digital heritage comprises a broad variety of approaches and topics and involves researchers from multiple disciplines. While the use of digital methods in the text-oriented disciplines dealing with cultural heritage is widely discussed and canonized, an up-to-date investigation of cultural heritage as a scholarly field is currently missing. This extended abstract describes a three-stage investigation of standards, publications, disciplinary cultures, and scholars in the field of digital heritage, carried out in 2016 and 2017. It includes results of a workshop-based survey involving 44 researchers, 15 qualitative interviews, and an online survey with nearly 1000 participants. The overall finding is that the community is driven by researchers from European countries, especially Italy, with a background in the humanities, dealing with topics of data acquisition, data management, and visualization. Moreover, conference series are the most relevant venues for scientific discourse, and EU projects in particular set the pace as the most important research endeavours.
Review of 3D GIS Data Fusion Methods and Progress
NASA Astrophysics Data System (ADS)
Hua, Wei; Hou, Miaole; Hu, Yungang
2018-04-01
3D data fusion is a research hotspot in the fields of computer vision and fine mapping, and plays an important role in fine measurement, risk monitoring, data display, and other processes. At present, research on 3D data fusion in the surveying and mapping field focuses on fusing 3D models of terrain and ground objects. This paper summarizes the basic methods of 3D data fusion for terrain and ground objects developed in recent years, classifies the data structures and methods used to establish 3D models, and analyses and comments on some of the most widely used fusion methods.
NASA Technical Reports Server (NTRS)
1972-01-01
A survey of nondestructive evaluation (NDE) technology, which is discussed in terms of popular demands for a greater degree of quality, reliability, and safety in industrial products, is presented as an overview of the NDE field to serve the needs of middle management. Three NDE methods are presented: acoustic emission, the use of coherent (laser) light, and ultrasonic holography.
Jeffrey, N.; Abdalla, F. B.; Lahav, O.; ...
2018-05-15
Mapping the underlying density field, including non-visible dark matter, using weak gravitational lensing measurements is now a standard tool in cosmology. Due to its importance to the science results of current and upcoming surveys, the quality of the convergence reconstruction methods should be well understood. We compare three different mass map reconstruction methods: Kaiser-Squires (KS), Wiener filter, and GLIMPSE. KS is a direct inversion method, taking no account of survey masks or noise. The Wiener filter is well motivated for Gaussian density fields in a Bayesian framework. The GLIMPSE method uses sparsity, with the aim of reconstructing non-linearities in the density field. We compare these methods with a series of tests on the public Dark Energy Survey (DES) Science Verification (SV) data and on realistic DES simulations. The Wiener filter and GLIMPSE methods offer substantial improvement on the standard smoothed KS with a range of metrics. For both the Wiener filter and GLIMPSE convergence reconstructions we present a 12% improvement in Pearson correlation with the underlying truth from simulations. To compare the mapping methods' abilities to find mass peaks, we measure the difference between peak counts from simulated ΛCDM shear catalogues and catalogues with no mass fluctuations. This is a standard data vector when inferring cosmology from peak statistics. The maximum signal-to-noise value of these peak statistic data vectors was increased by a factor of 3.5 for the Wiener filter and by a factor of 9 using GLIMPSE. With simulations we measure the reconstruction of the harmonic phases, showing that the concentration of the phase residuals is improved 17% by GLIMPSE and 18% by the Wiener filter. We show that the correlation between the reconstructions from data and the foreground redMaPPer clusters is increased 18% by the Wiener filter and 32% by GLIMPSE. [Abridged]
Social surveys in HIV/AIDS: telling or writing? A comparison of interview and postal methods.
McEwan, R T; Harrington, B E; Bhopal, R S; Madhok, R; McCallum, A
1992-06-01
We compare a probability sample postal questionnaire survey and a quota controlled interview survey, and review the literature on these subjects. In contrast to other studies, where quota samples were not representative because of biased selection of respondents by interviewers, our quota sample was representative. Response rates were similar in our postal and interview surveys (74 and 77%, respectively), although many previous similar postal surveys had poor response rates. As in other comparison studies, costs were higher in our interview survey, substantive responses and the quality of responses to closed-ended questions were similar, and responses to open-ended questions were better in the interview survey. 'Socially unacceptable' responses on sexual behaviour were less likely in interviews. Quota controlled surveys are appropriate in surveys on HIV/AIDS under certain circumstances, e.g. where the population parameters are well known, and where interviewers can gain access to the entire population. Postal questionnaires are better for obtaining information on sexual behaviour, if adequate steps are taken to improve response rates, and when in-depth answers are not needed. For most surveys in the HIV/AIDS field we recommend the postal method.
Designing occupancy studies when false-positive detections occur
Clement, Matthew
2016-01-01
1. Recently, estimators have been developed to estimate occupancy probabilities when false-positive detections occur during presence-absence surveys. Some of these estimators combine different types of survey data to improve estimates of occupancy. With these estimators, there is a trade-off between the number of sample units surveyed, and the number and type of surveys at each sample unit. Guidance on efficient design of studies when false positives occur is unavailable. 2. For a range of scenarios, I identified survey designs that minimized the mean square error of the estimate of occupancy. I considered an approach that uses one survey method and two observation states and an approach that uses two survey methods. For each approach, I used numerical methods to identify optimal survey designs when model assumptions were met and parameter values were correctly anticipated, when parameter values were not correctly anticipated, and when the assumption of no unmodelled detection heterogeneity was violated. 3. Under the approach with two observation states, false-positive detections increased the number of recommended surveys, relative to standard occupancy models. If parameter values could not be anticipated, pessimism about detection probabilities avoided poor designs. Detection heterogeneity could require more or fewer repeat surveys, depending on parameter values. If model assumptions were met, the approach with two survey methods was inefficient. However, with poor anticipation of parameter values, with detection heterogeneity, or with removal sampling schemes, combining two survey methods could improve estimates of occupancy. 4. Ignoring false positives can yield biased parameter estimates, yet false positives greatly complicate the design of occupancy studies. Specific guidance for major types of false-positive occupancy models, and for two assumption violations common in field data, can conserve survey resources.
This guidance can be used to design efficient monitoring programs and studies of species occurrence, species distribution, or habitat selection, when false positives occur during surveys.
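The bias that motivates this work can be seen in a small Monte Carlo sketch: with false positives, the naive "detected at least once" occupancy estimate is inflated. All rates below are illustrative choices, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
psi, p, fp = 0.4, 0.5, 0.1       # true occupancy, detection, false-positive rates
n_sites, n_surveys = 20_000, 4

occupied = rng.random(n_sites) < psi
# Occupied sites are detected with probability p per survey;
# unoccupied sites yield false positives with probability fp per survey.
detections = np.where(occupied[:, None],
                      rng.random((n_sites, n_surveys)) < p,
                      rng.random((n_sites, n_surveys)) < fp)

naive_psi = detections.any(axis=1).mean()
print(f"true psi = {psi}, naive estimate = {naive_psi:.3f}")  # naive exceeds psi
```

The expected naive estimate here is psi·(1-(1-p)⁴) + (1-psi)·(1-(1-fp)⁴) ≈ 0.58, well above the true 0.4, which is why estimators that explicitly model false positives, and designs tuned for them, are needed.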
Pederson, Linda L.; Thorne, Stacy L.; Caraballo, Ralph S.; Evans, Brian; Athey, Leslie; McMichael, Joseph
2010-01-01
Objectives. We sought to modify an instrument and to use it to collect information on smoking knowledge, attitudes, beliefs, and behaviors among Hispanics/Latinos, and to adapt survey methods to obtain high participation levels. Methods. Promotoras (outreach workers) conducted face-to-face interviews with 1485 Hispanic adults (July 2007–April 2008). The project team used GeoFrame field enumeration methods to develop a sampling frame from households in randomly selected colonias (residential areas along the Texas–Mexico border that may lack some basic necessities, e.g., potable water) in El Paso, Texas. Results. The revised questionnaire included 36 unchanged items from the State Adult Tobacco Survey, 7 modified items, and 17 new items focusing on possible culturally specific quitting methods, secondhand smoke issues, and attitudes and knowledge about tobacco use that might be unique to Hispanic/Latino groups. The eligibility rate was 90.2%, and the conservative combined completed screener and interview response rate was 80.0%. Conclusions. Strategic, targeted, carefully designed methods and surveys can achieve high reach and response rates in hard-to-reach populations. Similar procedures could be used to obtain the cooperation of groups who may not be accessible with traditional methods. PMID:20147687
A Novel Approach to Measuring Glacier Motion Remotely using Aerial LiDAR
NASA Astrophysics Data System (ADS)
Telling, J. W.; Fountain, A. G.; Glennie, C. L.; Obryk, M.
2016-12-01
Glaciers play an important role in the Earth's climate system, affecting climate and ocean circulation at the largest scales, and contributing to runoff and sea level rise at local scales. A key variable is glacier motion, and tracking motion is critical to understanding how flow responds to changes in boundary conditions and to testing predictive models of glacier behavior. Although field measurements of glacier motion have been collected since the 19th century, field operations remain a slow, laborious, sometimes dangerous task yielding only a few data points per glacier. In recent decades satellite imaging of glacier motion has proved very fruitful, but the spatial resolution of the imagery restricts applications to regional scale analyses. Here we assess the utility of using aerial LiDAR surveys and particle image velocimetry (PIV) as a method for tracking glacier motion over relatively small regions (<50 km²). Five glaciers in Taylor Valley, Antarctica, were surveyed twice; the first LiDAR survey was conducted in 2001 and the second in 2014. The cold-dry climate conditions of Taylor Valley and the relatively slow motion of its polar glaciers (≤8 m yr⁻¹) preserve the surface roughness and limit the advected distance of the features, making the 13-year interval between surveys sufficient for monitoring glacier motion. Initial results yield reasonable flow fields and show great promise. The range of flow speeds, surface roughness, and transient snow patches found on these glaciers provides a robust test of PIV methods. Results will be compared to field measurements of glacier velocity and to results from feature tracking, a common technique based on paired optical images. The merits of using this technique to measure glacier motion will be discussed in the context of these results. Applying PIV to LiDAR point clouds may offer a higher resolution data set of glacier velocity than satellite images or field measurements.
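At its core, the PIV step cross-correlates elevation windows from the two survey epochs and reads the displacement off the correlation peak. A minimal integer-pixel sketch with NumPy (the synthetic rough surface and the imposed shift are illustrative; a real PIV workflow adds windowing, sub-pixel peak fitting, and outlier rejection):

```python
import numpy as np

def piv_shift(window_a, window_b):
    """Integer-pixel displacement (dy, dx) between two elevation windows,
    found at the peak of their FFT-based circular cross-correlation."""
    a = window_a - window_a.mean()
    b = window_b - window_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak indices above Nyquist to negative lags.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(2)
surface = rng.normal(size=(64, 64))                   # synthetic rough glacier surface
moved = np.roll(surface, shift=(3, -2), axis=(0, 1))  # displacement between epochs
print(piv_shift(surface, moved))                      # -> (3, -2)
```

Applied tile by tile across a pair of gridded LiDAR DEMs, this yields a velocity field once the pixel shifts are scaled by grid spacing and divided by the interval between surveys.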
HerMES: point source catalogues from Herschel-SPIRE observations II
NASA Astrophysics Data System (ADS)
Wang, L.; Viero, M.; Clarke, C.; Bock, J.; Buat, V.; Conley, A.; Farrah, D.; Guo, K.; Heinis, S.; Magdis, G.; Marchetti, L.; Marsden, G.; Norberg, P.; Oliver, S. J.; Page, M. J.; Roehlly, Y.; Roseboom, I. G.; Schulz, B.; Smith, A. J.; Vaccari, M.; Zemcov, M.
2014-11-01
The Herschel Multi-tiered Extragalactic Survey (HerMES) is the largest Guaranteed Time Key Programme on the Herschel Space Observatory. With a wedding cake survey strategy, it consists of nested fields with varying depth and area totalling ~380 deg². In this paper, we present deep point source catalogues extracted from Herschel-Spectral and Photometric Imaging Receiver (SPIRE) observations of all HerMES fields, except for the later addition of the 270 deg² HerMES Large-Mode Survey (HeLMS) field. These catalogues constitute the second Data Release (DR2) made in 2013 October. A sub-set of these catalogues, which consists of bright sources extracted from Herschel-SPIRE observations completed by 2010 May 1 (covering ~74 deg²) were released earlier in the first extensive data release in 2012 March. Two different methods are used to generate the point source catalogues, the SUSSEXTRACTOR point source extractor used in two earlier data releases (EDR and EDR2) and a new source detection and photometry method. The latter combines an iterative source detection algorithm, STARFINDER, and a De-blended SPIRE Photometry algorithm. We use end-to-end Herschel-SPIRE simulations with realistic number counts and clustering properties to characterize basic properties of the point source catalogues, such as the completeness, reliability, photometric and positional accuracy. Over 500 000 catalogue entries in HerMES fields (except HeLMS) are released to the public through the HeDAM (Herschel Database in Marseille) website (http://hedam.lam.fr/HerMES).
Underwater MASW to evaluate stiffness of water-bottom sediments
Park, C.B.; Miller, R.D.; Xia, J.; Ivanov, J.; Sonnichsen, G.V.; Hunter, J.A.; Good, R.L.; Burns, R.A.; Christian, H.
2005-01-01
The multichannel analysis of surface waves (MASW) was initially intended as a land survey method to investigate the elastic properties of near-surface materials. The acquired data are first analyzed for dispersion characteristics and, from these, the shear-wave velocity is estimated using an inversion technique. Land applications show the potential of the MASW method to map 2D bedrock surfaces, zones of low strength, Poisson's ratio, and voids, as well as to generate shear-wave profiles for various other geotechnical problems. An overview is given of several underwater applications of the MASW method to characterize the stiffness distribution of water-bottom sediments. The first application details a survey in shallow water (1-6 m) in the Fraser River (Canada). The second application is an innovative experimental marine seismic survey in the North Atlantic Ocean near oil fields in Grand Bank offshore Newfoundland.
The Chandra XBoötes Survey - IV: Mid-Infrared and Submillimeter Counterparts
NASA Astrophysics Data System (ADS)
Brown, Arianna; Mitchell-Wynne, Ketron; Cooray, Asantha R.; Nayyeri, Hooshang
2016-06-01
In this work, we use a Bayesian technique to identify mid-IR and submillimeter counterparts for 3,213 X-ray point sources detected in the Chandra XBoötes Survey so as to characterize the relationship between black hole activity and star formation in the XBoötes region. The Chandra XBoötes Survey is a 5-ks X-ray survey of the 9.3 square degree Boötes Field of the NOAO Deep Wide-Field Survey (NDWFS), a survey imaged from the optical to the near-IR. We use a likelihood ratio analysis on Spitzer-IRAC data taken from the Spitzer Deep, Wide-Field Survey (SDWFS) to determine mid-IR counterparts, and a similar method on Herschel-SPIRE sources detected at 250 µm from the Herschel Multi-tiered Extragalactic Survey to determine the submillimeter counterparts. The likelihood ratio analysis (LRA) provides the probability that an IRAC or SPIRE point source is the true counterpart to a Chandra source. The analysis comprises three parts: the normalized magnitude distributions of counterparts and background sources, and the radial probability distribution of the separation distance between the IRAC or SPIRE source and the Chandra source. Many Chandra sources have multiple prospective counterparts in each band, so additional analysis is performed to determine the identification reliability of the candidates. Identification reliability values lie between 0 and 1, and sources with identification reliability values ≥0.8 are chosen to be the true counterparts. With these results, we will consider the statistical implications of the sample's redshifts, mid-IR and submillimeter luminosities, and star formation rates.
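The likelihood ratio analysis has a standard closed form, LR = q(m)·f(r)/n(m), where q(m) is the magnitude distribution of true counterparts, n(m) the background magnitude distribution, and f(r) the positional-offset probability density. A minimal sketch assuming circular Gaussian positional errors and a Sutherland-Saunders-style reliability normalization; all numbers, including the counterpart fraction Q, are illustrative rather than values from this survey:

```python
import numpy as np

def likelihood_ratio(r, sigma, q_m, n_m):
    """LR = q(m) f(r) / n(m) for a candidate at offset r (arcsec),
    with a circular Gaussian positional-error density of width sigma."""
    f_r = np.exp(-r**2 / (2 * sigma**2)) / (2 * np.pi * sigma**2)
    return q_m * f_r / n_m

def reliability(lr, Q=0.9):
    """Share probability among one source's candidates; Q is the assumed
    fraction of X-ray sources with a detectable counterpart."""
    lr = np.asarray(lr, dtype=float)
    return lr / (lr.sum() + (1 - Q))

# Two candidates for one X-ray source: one close, in a sparse magnitude bin;
# one farther away, in a denser magnitude bin.
lr = likelihood_ratio(r=np.array([0.5, 2.0]), sigma=1.0,
                      q_m=np.array([0.3, 0.2]), n_m=np.array([0.01, 0.02]))
rel = reliability(lr)
print(rel)   # first candidate clears the >=0.8 identification threshold
```

In practice q(m) and n(m) are estimated empirically from the counterpart and background source catalogues, and the ≥0.8 cut described in the abstract is applied to the resulting reliabilities.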
Near-UV Sources in the Hubble Ultra Deep Field: The Catalog
NASA Technical Reports Server (NTRS)
Gardner, Jonathan P.; Voyrer, Elysse; de Mello, Duilia F.; Siana, Brian; Quirk, Cori; Teplitz, Harry I.
2009-01-01
The catalog from the first high resolution U-band image of the Hubble Ultra Deep Field, taken with Hubble s Wide Field Planetary Camera 2 through the F300W filter, is presented. We detect 96 U-band objects and compare and combine this catalog with a Great Observatories Origins Deep Survey (GOODS) B-selected catalog that provides B, V, i, and z photometry, spectral types, and photometric redshifts. We have also obtained Far-Ultraviolet (FUV, 1614 Angstroms) data with Hubble s Advanced Camera for Surveys Solar Blind Channel (ACS/SBC) and with Galaxy Evolution Explorer (GALEX). We detected 31 sources with ACS/SBC, 28 with GALEX/FUV, and 45 with GALEX/NUV. The methods of observations, image processing, object identification, catalog preparation, and catalog matching are presented.
ERIC Educational Resources Information Center
Cho, Yonjoo; Jo, Sung Jun; Park, Sunyoung; Kang, Ingu; Chen, Zengguan
2011-01-01
This study conducted a citation network analysis (CNA) of human performance technology (HPT) to examine the current state of the field. Previous reviews of the field have used traditional research methods, such as content analysis, survey, Delphi, and citation analysis. The distinctive features of CNA come from using a social network analysis…
ERIC Educational Resources Information Center
Smith, Ryan C.; Bowdring, Molly A.; Geller, E. Scott
2015-01-01
Objective: The determinants of alcohol consumption among university students were investigated in a downtown field setting with blood alcohol content (BAC) as the dependent variable. Participants: In total, 521 participants completed a brief survey and had their BAC assessed during April 2013. Methods: Between 10:00 pm and 2:00 am, teams of…
NASA Astrophysics Data System (ADS)
Alkhatib Alkontar, Rozan; Calou, Paul; Rohmer, Jérôme; Munschy, Marc
2017-04-01
Among the surface exploration methods that have been developed to meet the new requirements of archaeological research, geophysical methods offer a wide range of applications in the study of buried deposits. As a result of the most recent developments, the magnetic-field prospection method is very efficient at highlighting buried foundations even if the corresponding construction material is weakly magnetized, like, for example, limestone. The magnetic field measured at a specific place and time is the vector sum of the main regional magnetic field, the effect of subsurface structures, the temporal variation (mainly solar influence), and local disturbances such as power lines, buildings, fences … The measurement method is implemented using an array of fluxgate 3-component magnetometers carried 1 m above the ground. The advantage of using vector magnetometers is that magnetic compensation can be achieved. An array of four magnetometers is used to survey the archaeological site of Thaje (100-300 yr BC), Saudi Arabia, and to obtain a precise location of measurements, a differential global navigation satellite system is used with an accuracy of about 10 cm relative to the base station. 25 hectares were surveyed within 13 days, and the data are compiled to obtain a total magnetic intensity map with a node spacing of 25 cm. The map is interpreted using magnetic field transforms such as reduction to the pole, fractional vertical derivatives, and the tilt angle. The results show a fairly precise plan of the city, where main streets, buildings, and the rampart can be clearly distinguished.
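Of the transforms listed, the tilt angle has a particularly compact form: the arctangent of the vertical derivative of the total-magnetic-intensity (TMI) grid over its total horizontal gradient, with the edges of shallow sources tending to fall near the zero contour. A sketch assuming a regular grid, with the vertical derivative taken spectrally by multiplying by |k| (the grid spacing and synthetic anomaly are illustrative, not the Thaje data):

```python
import numpy as np

def tilt_angle(tmi, spacing=0.25):
    """Tilt angle (radians) of a gridded TMI map: arctan(dT/dz / |grad_h T|).
    Vertical derivative computed in the wavenumber domain via the |k| operator."""
    ny, nx = tmi.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=spacing)[np.newaxis, :]
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=spacing)[:, np.newaxis]
    k = np.hypot(kx, ky)
    dz = np.fft.ifft2(k * np.fft.fft2(tmi)).real   # vertical derivative
    dy, dx = np.gradient(tmi, spacing)             # horizontal derivatives
    return np.arctan2(dz, np.hypot(dx, dy))

# Synthetic anomaly: one Gaussian "buried foundation" signature on a 16 m grid.
y, x = np.mgrid[0:64, 0:64] * 0.25
tmi = 50.0 * np.exp(-((x - 8.0) ** 2 + (y - 8.0) ** 2) / 4.0)
tilt = tilt_angle(tmi)
print(tilt.shape)   # same grid as the input map
```

Because arctan bounds the output to ±π/2 regardless of anomaly amplitude, the tilt angle equalizes strong and weak sources, which is what makes it useful for tracing limestone foundations in a map like the one described.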
Martin, Jeffrey D.; Norman, Julia E.; Sandstrom, Mark W.; Rose, Claire E.
2017-09-06
U.S. Geological Survey monitoring programs extensively used two analytical methods, gas chromatography/mass spectrometry and liquid chromatography/mass spectrometry, to measure pesticides in filtered water samples during 1992–2012. In October 2012, the monitoring programs began using direct aqueous-injection liquid chromatography tandem mass spectrometry as a new analytical method for pesticides. The change in analytical methods, however, has the potential to inadvertently introduce bias in analysis of datasets that span the change. A field study was designed to document performance of the new method in a variety of stream-water matrices and to quantify any potential changes in measurement bias or variability that could be attributed to changes in analytical methods. The goals of the field study were to (1) summarize performance (bias and variability of pesticide recovery) of the new method in a variety of stream-water matrices; (2) compare performance of the new method in laboratory blank water (laboratory reagent spikes) to that in a variety of stream-water matrices; (3) compare performance (analytical recovery) of the new method to that of the old methods in a variety of stream-water matrices; (4) compare pesticide detections and concentrations measured by the new method to those of the old methods in a variety of stream-water matrices; (5) compare contamination measured by field blank water samples in old and new methods; (6) summarize the variability of pesticide detections and concentrations measured by the new method in field duplicate water samples; and (7) identify matrix characteristics of environmental water samples that adversely influence the performance of the new method. Stream-water samples and a variety of field quality-control samples were collected at 48 sites in the U.S. Geological Survey monitoring networks during June–September 2012.
Stream sites were located across the United States and included sites in agricultural and urban land-use settings, as well as sites on major rivers. The results of the field study identified several challenges for the analysis and interpretation of data analyzed by both old and new methods, particularly when data span the change in methods and are combined for analysis of temporal trends in water quality. The main challenges identified are large (greater than 30 percent), statistically significant differences in analytical recovery, detection capability, and (or) measured concentrations for selected pesticides. These challenges are documented and discussed, but specific guidance or statistical methods to resolve these differences in methods are beyond the scope of the report. The results of the field study indicate that the implications of the change in analytical methods must be assessed individually for each pesticide and method. Understanding the possible causes of the systematic differences in concentrations between methods that remain after recovery adjustment might be necessary to determine how to account for the differences in data analysis. Because recoveries for each method are independently determined from separate reference standards and spiking solutions, the differences might be due to an error in one of the reference standards or solutions or some other basic aspect of standard procedure in the analytical process. Further investigation of the possible causes is needed, which will lead to specific decisions on how to compensate for these differences in concentrations in data analysis. In the event that further investigations do not provide insight into the causes of systematic differences in concentrations between methods, the authors recommend continuing to collect and analyze paired environmental water samples by both old and new methods.
This effort should be targeted to seasons, sites, and expected concentrations to supplement those concentrations already assessed and to compare the ongoing analytical recovery of old and new methods to those observed in the summer and fall of 2012.
Hardigan, Patrick C; Popovici, Ioana; Carvajal, Manuel J
2016-01-01
There is a gap between increasing demands from pharmacy journals, publishers, and reviewers for high survey response rates and the actual responses often obtained in the field by survey researchers. Presumably demands have been set high because response rates, times, and costs affect the validity and reliability of survey results. Explore the extent to which survey response rates, average response times, and economic costs are affected by conditions under which pharmacist workforce surveys are administered. A random sample of 7200 U.S. practicing pharmacists was selected. The sample was stratified by delivery method, questionnaire length, item placement, and gender of respondent for a total of 300 observations within each subgroup. A job satisfaction survey was administered during March-April 2012. Delivery method was the only classification showing significant differences in response rates and average response times. The postal mail procedure accounted for the highest response rates of completed surveys, but the email method exhibited the quickest turnaround. A hybrid approach, consisting of a combination of postal and electronic means, showed the least favorable results. Postal mail was 2.9 times more cost effective than the email approach and 4.6 times more cost effective than the hybrid approach. Researchers seeking to increase practicing pharmacists' survey participation and reduce response time and related costs can benefit from the analytical procedures tested here.
Simultaneous stochastic inversion for geomagnetic main field and secular variation. II - 1820-1980
NASA Technical Reports Server (NTRS)
Bloxham, Jeremy; Jackson, Andrew
1989-01-01
With the aim of producing readable time-dependent maps of the geomagnetic field at the core-mantle boundary, the method of simultaneous stochastic inversion for the geomagnetic main field and secular variation, described by Bloxham (1987), was applied to survey data from the period 1820-1980 to yield two time-dependent geomagnetic-field models, one for the period 1900-1980 and the other for 1820-1900. Particular consideration was given to the effect of crustal fields on observations. It was found that the existing methods of accounting for these fields as sources of random noise are inadequate in two circumstances: (1) when sequences of measurements are made at one particular site, and (2) for measurements made at satellite altitude. The present model shows many of the features in the earth's magnetic field at the core-mantle boundary described by Bloxham and Gubbins (1985) and supports many of their earlier conclusions.
Scenario Evaluator for Electrical Resistivity survey pre-modeling tool
Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D.; Halford, Keith J.; Binley, Andrew; Lane, John W.; Werkema, Dale D.
2017-01-01
Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.
Smith, Bruce D.; Thamke, Joanna N.; Cain, Michael J.; Tyrrell, Christa; Hill, Patricia L.
2006-01-01
This report is a data release for a helicopter electromagnetic and magnetic survey that was conducted during August 2004 in a 275-square-kilometer area that includes the East Poplar oil field on the Fort Peck Indian Reservation. The electromagnetic equipment consisted of six different coil-pair orientations that measured resistivity at separate frequencies from about 400 hertz to about 140,000 hertz. The electromagnetic resistivity data were converted to six electrical conductivity grids, each representing different approximate depths of investigation. The range of subsurface investigation is comparable to the depth of shallow aquifers. Areas of high conductivity in shallow aquifers in the East Poplar oil field area are being delineated by the U.S. Geological Survey, in cooperation with the Fort Peck Assiniboine and Sioux Tribes, in order to map areas of saline-water plumes. Ground electromagnetic methods were first used during the early 1990s to delineate more than 31 square kilometers of high conductivity saline-water plumes in a portion of the East Poplar oil field area. In the 10 years since the first delineation, the quality of water from some wells completed in the shallow aquifers in the East Poplar oil field changed markedly. The extent of saline-water plumes in 2004 likely differs from that delineated in the early 1990s. The geophysical and hydrologic information from U.S. Geological Survey studies is being used by resource managers to develop ground-water resource plans for the area.
Particle and flow field holography: A critical survey
NASA Technical Reports Server (NTRS)
Trolinger, James D.
1987-01-01
A brief background is provided for the fields of particle and flow visualization holography. A summary of methods currently in use is given, followed by a discussion of more recent and unique applications. The problem of data reduction is discussed. A state of the art summary is then provided with a prognosis of the future of the field. Particle and flow visualization holography are characterized as powerful tools currently in wide use and with significant untapped potential.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tibuleac, Ileana
2016-06-30
A new, cost effective and non-invasive exploration method using ambient seismic noise has been tested at Soda Lake, NV, with promising results. The material included in this report demonstrates that, with the advantage of initial S-velocity models estimated from ambient noise surface waves, the seismic reflection survey, although with lower resolution, reproduces the results of the active survey when the ambient seismic noise is not contaminated by strong cultural noise. Ambient noise resolution is less at depth (below 1000 m) compared to the active survey. In general, the results are promising and useful information can be recovered from ambient seismic noise, including dipping features and fault locations.
NASA Astrophysics Data System (ADS)
Pattle, Kate; Ward-Thompson, Derek; Hasegawa, Tetsuo; Bastien, Pierre; Kwon, Woojin; Lai, Shih-Ping; Qiu, Keping; Furuya, Ray; Berry, David; JCMT BISTRO Survey Team
2018-06-01
We present the first high-resolution, submillimeter-wavelength polarimetric observations of—and thus direct observations of the magnetic field morphology within—the dense gas of the Pillars of Creation in M16. These 850 μm observations, taken as part of the B-Fields in Star-forming Region Observations Survey (BISTRO) using the POL-2 polarimeter on the Submillimeter Common-User Bolometer Array 2 (SCUBA-2) camera on the James Clerk Maxwell Telescope (JCMT), show that the magnetic field runs along the length of the Pillars, perpendicular to and decoupled from the field in the surrounding photoionized cloud. Using the Chandrasekhar–Fermi method we estimate a plane-of-sky magnetic field strength of 170–320 μG in the Pillars, consistent with their having been formed through the compression of gas with initially weak magnetization. The observed magnetic field strength and morphology suggests that the magnetic field may be slowing the Pillars’ evolution into cometary globules. We thus hypothesize that the evolution and lifetime of the Pillars may be strongly influenced by the strength of the coupling of their magnetic field to that of their parent photoionized cloud—i.e., that the Pillars’ longevity results from magnetic support.
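The plane-of-sky field strength quoted above comes from the Chandrasekhar–Fermi relation, which combines the gas density, the non-thermal velocity dispersion, and the dispersion of polarization angles. A minimal sketch of the standard relation in CGS units (the correction factor Q ≈ 0.5 and the example inputs are generic illustrative values, not the BISTRO measurements):

```python
import math

def cf_field_strength_uG(n_h2_cm3, sigma_v_kms, sigma_theta_deg, q=0.5):
    """Plane-of-sky B (microgauss) via the Chandrasekhar-Fermi relation:

        B_pos = Q * sqrt(4*pi*rho) * sigma_v / sigma_theta   (CGS)

    where rho is the gas mass density, sigma_v the line-of-sight velocity
    dispersion, and sigma_theta the polarization-angle dispersion (radians).
    """
    mu = 2.8                      # mean molecular weight per H2 (incl. He)
    m_h = 1.6735575e-24           # hydrogen mass, g
    rho = mu * m_h * n_h2_cm3     # g cm^-3
    sigma_v = sigma_v_kms * 1e5   # km/s -> cm/s
    sigma_theta = math.radians(sigma_theta_deg)
    b_gauss = q * math.sqrt(4.0 * math.pi * rho) * sigma_v / sigma_theta
    return b_gauss * 1e6          # gauss -> microgauss
```

Note the inverse dependence on the angle dispersion: a well-ordered field (small sigma_theta) implies a stronger field for the same turbulence.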
NASA Astrophysics Data System (ADS)
Leker, Lindsey Beth
Stereotype threat is a widely researched phenomenon shown to impact performance in testing and evaluation situations (Katz, Roberts, & Robinson, 1965; Steele & Aronson, 1995). When related to gender, stereotype threat can lead women to score lower than men on standardized math exams (Spencer, Steele, & Quinn, 1999). Stereotype threat may be one reason women have lower enrollment in most science, technology, engineering, and mathematics (STEM) majors, hold a smaller number of STEM careers than men, and have a higher attrition rate in STEM professions (Hill, Corbet, & Rose, 2010; Picho & Brown 2011; Sorby & Baartmans, 2000). Most research has investigated stereotype threat using experiments, yielding mixed results (Stoet & Geary, 2012). Thus, there is a need to explore stereotype threat using quantitative surveys and qualitative methods to examine other contextual factors that contribute to gender differences in STEM fields. This dissertation outlined a mixed methods study designed to, first, qualitatively explore stereotype threat and contextual factors related to high-achieving women in STEM fields, as well as women who have failed and/or avoided STEM fields. Then, the quantitative portion of the study used the themes from the qualitative phase to create a survey that measured stereotype threat and other contextual variables related to STEM success and failure/avoidance. Fifteen participants were interviewed for the qualitative phase of the study and six themes emerged. The quantitative survey was completed by 242 undergraduate participants. T-tests, correlations, regressions, and mediation analyses were used to analyze the data. There were significant relationships between stereotype threat and STEM confidence, STEM anxiety, giving up in STEM, and STEM achievement. Overall, this mixed methods study advanced qualitative research on stereotype threat, developed a much-needed scale for the measurement of stereotype threat, and tested the developed scale.
NASA Astrophysics Data System (ADS)
Jeffrey, N.; Abdalla, F. B.; Lahav, O.; Lanusse, F.; Starck, J.-L.; Leonard, A.; Kirk, D.; Chang, C.; Baxter, E.; Kacprzak, T.; Seitz, S.; Vikram, V.; Whiteway, L.; Abbott, T. M. C.; Allam, S.; Avila, S.; Bertin, E.; Brooks, D.; Rosell, A. Carnero; Kind, M. Carrasco; Carretero, J.; Castander, F. J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Davis, C.; De Vicente, J.; Desai, S.; Doel, P.; Eifler, T. F.; Evrard, A. E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; Hartley, W. G.; Honscheid, K.; Hoyle, B.; James, D. J.; Jarvis, M.; Kuehn, K.; Lima, M.; Lin, H.; March, M.; Melchior, P.; Menanteau, F.; Miquel, R.; Plazas, A. A.; Reil, K.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Walker, A. R.
2018-05-01
Mapping the underlying density field, including non-visible dark matter, using weak gravitational lensing measurements is now a standard tool in cosmology. Due to its importance to the science results of current and upcoming surveys, the quality of the convergence reconstruction methods should be well understood. We compare three methods: Kaiser-Squires (KS), Wiener filter, and GLIMPSE. KS is a direct inversion, not accounting for survey masks or noise. The Wiener filter is well-motivated for Gaussian density fields in a Bayesian framework. GLIMPSE uses sparsity, aiming to reconstruct non-linearities in the density field. We compare these methods with several tests using public Dark Energy Survey (DES) Science Verification (SV) data and realistic DES simulations. The Wiener filter and GLIMPSE offer substantial improvements over smoothed KS with a range of metrics. Both the Wiener filter and GLIMPSE convergence reconstructions show a 12% improvement in Pearson correlation with the underlying truth from simulations. To compare the mapping methods' abilities to find mass peaks, we measure the difference between peak counts from simulated ΛCDM shear catalogues and catalogues with no mass fluctuations (a standard data vector when inferring cosmology from peak statistics); the maximum signal-to-noise of these peak statistics is increased by a factor of 3.5 for the Wiener filter and 9 for GLIMPSE. With simulations we measure the reconstruction of the harmonic phases; the phase residuals' concentration is improved 17% by GLIMPSE and 18% by the Wiener filter. The correlation between reconstructions from data and foreground redMaPPer clusters is increased 18% by the Wiener filter and 32% by GLIMPSE.
Influence of survey strategy and interpolation model on DEM quality
NASA Astrophysics Data System (ADS)
Heritage, George L.; Milan, David J.; Large, Andrew R. G.; Fuller, Ian C.
2009-11-01
Accurate characterisation of morphology is critical to many studies in the field of geomorphology, particularly those dealing with changes over time. Digital elevation models (DEMs) are commonly used to represent morphology in three dimensions. The quality of the DEM is largely a function of the accuracy of individual survey points, field survey strategy, and the method of interpolation. Recommendations concerning field survey strategy and appropriate methods of interpolation are currently lacking. Furthermore, the majority of studies to date consider error to be uniform across a surface. This study quantifies survey strategy and interpolation error for a gravel bar on the River Nent, Blagill, Cumbria, UK. Five sampling strategies were compared: (i) cross section; (ii) bar outline only; (iii) bar and chute outline; (iv) bar and chute outline with spot heights; and (v) aerial LiDAR equivalent, derived from degraded terrestrial laser scan (TLS) data. Digital elevation models were then produced using five different common interpolation algorithms. Each resultant DEM was differentiated from a terrestrial laser scan of the gravel bar surface in order to define the spatial distribution of vertical and volumetric error. Overall, triangulation with linear interpolation (TIN) or point kriging appeared to provide the best interpolators for the bar surface. Lowest error on average was found for the simulated aerial LiDAR survey strategy, regardless of interpolation technique. However, comparably low errors were also found for the bar-chute-spot sampling strategy when TINs or point kriging was used as the interpolator. The magnitude of the errors between survey strategies exceeded those found between interpolation techniques for a specific survey strategy.
Strong relationships between local surface topographic variation (as defined by the standard deviation of vertical elevations in a 0.2-m diameter moving window), and DEM errors were also found, with much greater errors found at slope breaks such as bank edges. A series of curves are presented that demonstrate these relationships for each interpolation and survey strategy. The simulated aerial LiDAR data set displayed the lowest errors across the flatter surfaces; however, sharp slope breaks are better modelled by the morphologically based survey strategy. The curves presented have general application to spatially distributed data of river beds and may be applied to standard deviation grids to predict spatial error within a surface, depending upon sampling strategy and interpolation algorithm.
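The DEM-quality test described above, interpolating sparse survey points and differencing the result against a denser "truth" surface such as a TLS scan, can be sketched with SciPy, whose scattered 'linear' interpolation is exactly a Delaunay TIN. This is a generic sketch under that assumption, not the authors' five-algorithm comparison:

```python
import numpy as np
from scipy.interpolate import griddata

def dem_error(sample_xy, sample_z, truth_xy, truth_z, method="linear"):
    """Interpolate sparse survey points onto 'truth' locations and return
    the mean and standard deviation of the vertical error.

    method="linear" on scattered points builds a Delaunay triangulation
    with linear interpolation, i.e. a TIN surface; "cubic" is another
    common choice for comparison.
    """
    z_hat = griddata(sample_xy, sample_z, truth_xy, method=method)
    err = z_hat - truth_z
    err = err[~np.isnan(err)]  # drop truth points outside the convex hull
    return float(np.mean(err)), float(np.std(err))
```

Running this for each survey strategy (subsample) and interpolator, and mapping `err` spatially rather than summarizing it, reproduces the kind of error-distribution analysis the study performs.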
Vertical Cable Seismic Survey for Hydrothermal Deposit
NASA Astrophysics Data System (ADS)
Asakawa, E.; Murakami, F.; Sekino, Y.; Okamoto, T.; Ishikawa, K.; Tsukahara, H.; Shimura, T.
2012-04-01
Vertical cable seismic is one of the reflection seismic methods. It uses hydrophone arrays vertically moored from the seafloor to record acoustic waves generated by surface, deep-towed or ocean bottom sources. By analyzing the reflections from the sub-seabed, we can image the subsurface structure. This type of survey is generally called VCS (Vertical Cable Seismic). Because VCS is an efficient high-resolution 3D seismic survey method for a spatially-bounded area, we proposed the method for the hydrothermal deposit survey tool development program that the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started in 2009. We are now developing a VCS system, including not only data acquisition hardware but also data processing and analysis techniques. Our first VCS survey was carried out in Lake Biwa, Japan, in November 2009 as a feasibility study. Prestack depth migration was applied to the 3D VCS data to obtain a high-quality 3D depth volume. Based on the results from the feasibility study, we developed two autonomous recording VCS systems. After a trial experiment in the open ocean at a water depth of about 400 m, we carried out the second VCS survey at Iheya Knoll with a deep-towed source. In this survey, we established the procedures for the deployment/recovery of the system and examined the locations and the fluctuations of the vertical cables at a water depth of around 1000 m. The acquired VCS data clearly show the reflections from the sub-seafloor. Through the experiment, we confirmed that our VCS system works well even in the severe circumstances around seafloor hydrothermal deposits. We also found, however, that uncertainty in the locations of the source and the hydrophones can lower the quality of the subsurface image. It is therefore necessary to develop a total survey system that ensures accurate positioning and reliable deployment techniques.
We carried out two field surveys in FY2011. One was a 3D survey with a boomer as a high-resolution surface source; the other was a field survey in the Izena Cauldron, an active hydrothermal area in the Okinawa Trough. Through these surveys, VCS is becoming a practical tool for the exploration of seafloor hydrothermal deposits.
NASA Astrophysics Data System (ADS)
Blachowski, Jan; Łuczak, Jakub; Zagrodnik, Paulina
2018-01-01
Public participation geographic information system (GIS) and participatory mapping data collection methods are means that enhance capacity in generating, managing, and communicating spatial information in various fields ranging from local planning to environmental management. In this study these methods were used in two ways. The first was to gather information on the additional functionality of a campus web map expected by its potential users, i.e. students, staff and visitors, through a web-based survey. The second was to collect geographically referenced information on campus areas that are liked and disliked in a geo-survey carried out with the ArcGIS Online GeoForm application. The results of the first survey were used to map facilities such as bicycle infrastructure, building entrances, wheelchair-accessible infrastructure and benches. The results of the second were used to analyse the most and the least attractive parts of the campus with heat-map and hot-spot analyses in GIS. In addition, the answers were studied with regard to the visual and functional aspects of the campus area raised in the survey. The thematic layers developed from the field mapping and the geoprocessing of geo-survey data were included in the campus web map project. The paper describes the applied methodology of data collection, processing, analysis, interpretation and geovisualisation.
General field and office procedures for indirect discharge measurements
Benson, M.A.; Dalrymple, Tate
2001-04-01
The discharge of streams is usually measured by the current-meter method. During flood periods, however, it is frequently impossible or impractical to measure the discharges by this method when they occur. Consequently, many peak discharges must be determined after the passage of the flood by indirect methods, such as slope-area, contracted-opening, flow-over-dam, and flow-through-culvert, rather than by direct current-meter measurement. Indirect methods of determining peak discharge are based on hydraulic equations which relate the discharge to the water-surface profile and the geometry of the channel. A field survey is made after the flood to determine the location and elevation of high-water marks and the characteristics of the channel. Detailed descriptions of the general procedures used in collecting the field data and in computing the discharge are given in this report. Each of the methods requires special procedures described in subsequent chapters.
NASA Astrophysics Data System (ADS)
Chen, Weiying; Xue, Guoqiang; Khan, Muhammad Younis; Li, Hai
2017-03-01
The Huoqiu iron deposit is a typical Precambrian banded iron formation (BIF) field located in the North China Craton (NCC). To detect the deep ore bodies around Dawangzhuang Village in Yingshang County, north of the Huoqiu deposit field, electromagnetic methods were tested. Because the ore bodies are buried under very thick conductive Quaternary sediments, the use of EM methods is a great challenge. The short-offset transient electromagnetic method (SOTEM) was selected for testing in the area because of its detection depth and resolution. A 2D model was first built according to the geological information and magnetic measurement results. Then, 2D forward modeling and 1D inversion were carried out using FDTD and Occam's algorithm, respectively. The synthetic modeling results helped us with the survey design and interpretation. Two 1400-m-long survey lines with offsets of 500 and 1000 m were laid perpendicular to the BIF's strike, and the transmitting parameters were selected by a test measurement in the vicinity of a local village. Finally, the structure of the survey area and the BIF bodies were determined based on the 1D inversion results of the real data, and were consistent with the subsequent drilling results. Our application of SOTEM to detecting hidden BIF buried under a very thick conductive layer has shown that the method is capable of penetrating to depths of more than 1000 m even in a very conductive environment and will be an effective tool for deep resource investigation.
Amoutzopoulos, B; Steer, T; Roberts, C; Cade, J E; Boushey, C J; Collins, C E; Trolle, E; de Boer, E J; Ziauddeen, N; van Rossum, C; Buurma, E; Coyle, D; Page, P
2018-01-01
The aim of the present paper is to summarise current and future applications of dietary assessment technologies in nutrition surveys in developed countries. It includes the discussion of key points and highlights of subsequent developments from a panel discussion to address strengths and weaknesses of traditional dietary assessment methods (food records, FFQ, 24 h recalls, diet history with interviewer-assisted data collection) v. new technology-based dietary assessment methods (web-based and mobile device applications). The panel discussion 'Traditional methods v. new technologies: dilemmas for dietary assessment in population surveys', was held at the 9th International Conference on Diet and Activity Methods (ICDAM9), Brisbane, September 2015. Despite respondent and researcher burden, traditional methods have been most commonly used in nutrition surveys. However, dietary assessment technologies offer potential advantages including faster data processing and better data quality. This is a fast-moving field and there is evidence of increasing demand for the use of new technologies amongst the general public and researchers. There is a need for research and investment to support efforts being made to facilitate the inclusion of new technologies for rapid, accurate and representative data.
Determining open cluster membership. A Bayesian framework for quantitative member classification
NASA Astrophysics Data System (ADS)
Stott, Jonathan J.
2018-01-01
Aims: My goal is to develop a quantitative algorithm for assessing open cluster membership probabilities. The algorithm is designed to work with single-epoch observations. In its simplest form, only one set of program images and one set of reference images are required. Methods: The algorithm is based on a two-stage joint astrometric and photometric assessment of cluster membership probabilities. The probabilities were computed within a Bayesian framework using any available prior information. Where possible, the algorithm emphasizes simplicity over mathematical sophistication. Results: The algorithm was implemented and tested against three observational fields using published survey data. M 67 and NGC 654 were selected as cluster examples while a third, cluster-free, field was used for the final test data set. The algorithm shows good quantitative agreement with the existing surveys and has a false-positive rate significantly lower than the astrometric or photometric methods used individually.
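The two-stage combination described above, astrometric and photometric likelihoods merged within a Bayesian framework, can be reduced to a one-line application of Bayes' theorem. This is a toy illustration, not the paper's algorithm; the conditional-independence assumption between the two data types is mine:

```python
def membership_probability(l_ast_m, l_ast_f, l_phot_m, l_phot_f, prior_m=0.5):
    """Posterior probability that a star is a cluster member.

    Combines astrometric (ast) and photometric (phot) likelihoods under
    the member (m) and field (f) hypotheses, assuming the two data types
    are conditionally independent:

        P(m | data) = L_m * p / (L_m * p + L_f * (1 - p)),
        L_m = L_ast_m * L_phot_m,  L_f = L_ast_f * L_phot_f.
    """
    joint_m = l_ast_m * l_phot_m * prior_m
    joint_f = l_ast_f * l_phot_f * (1.0 - prior_m)
    return joint_m / (joint_m + joint_f)
```

In practice the likelihoods would come from fitted proper-motion and colour-magnitude distributions for the cluster and field populations, and the prior from the expected member fraction in the survey footprint.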
Evaluating the perennial stream using logistic regression in central Taiwan
NASA Astrophysics Data System (ADS)
Ruljigaljig, T.; Cheng, Y. S.; Lin, H. I.; Lee, C. H.; Yu, T. T.
2014-12-01
This study produces a perennial stream head potential map based on a logistic regression method within a Geographic Information System (GIS). Perennial stream initiation locations, which indicate where groundwater intersects the ground surface, were identified in the study area from field surveys. The perennial stream potential map for central Taiwan was constructed using the relationship between perennial streams and their causative factors, such as catchment area, slope gradient, aspect, elevation, groundwater recharge and precipitation. In total, 272 streams were surveyed in the field within the study area. The area under the ROC curve for the logistic regression model was 0.87. The results illustrate the importance of catchment area and groundwater recharge as key factors within the model. The model results within the GIS were then used to produce a map of perennial streams and to estimate the locations of perennial stream heads.
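The modelling step above, fitting a logistic regression on causative factors and scoring it with the area under the ROC curve, can be sketched with scikit-learn. The factor names follow the abstract, but the data here are synthetic and the pipeline is a generic sketch, not the authors' implementation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def stream_head_model(X, y):
    """Fit a logistic-regression potential model and report its ROC AUC.

    Columns of X would hold the causative factors (catchment area, slope
    gradient, aspect, elevation, groundwater recharge, precipitation);
    y is 1 at surveyed stream-head cells and 0 elsewhere.
    """
    model = LogisticRegression(max_iter=1000).fit(X, y)
    auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
    return model, auc
```

Applying `model.predict_proba` to every grid cell of the GIS raster stack would then yield the continuous potential map that gets classified and displayed.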
[SWOT Analysis of the National Survey on Current Status of Major Human Parasitic Diseases in China].
ZHU, Hui-hui; ZHOU, Chang-hai; CHEN, Ying-dan; ZANG, Wei; XIAO, Ning; ZHOU, Xiao-nong
2015-10-01
The National Survey on Current Status of Major Human Parasitic Diseases in China has been carried out since 2014 under the organization of the National Health and Family Planning Commission of the People's Republic of China. The National Institute of Parasitic Diseases, Chinese Center for Disease Control and Prevention (NIPD, China CDC) provided technical support and was responsible for quality control in this survey. This study used the SWOT method to analyze the strengths, weaknesses, opportunities and threats that were encountered by the NIPD, China CDC during the completion of the survey. Accordingly, working strategies were proposed to facilitate the future field work.
Accuracy and precision of stream reach water surface slopes estimated in the field and from maps
Isaak, D.J.; Hubert, W.A.; Krueger, K.L.
1999-01-01
The accuracy and precision of five tools used to measure stream water surface slope (WSS) were evaluated. Water surface slopes estimated in the field with a clinometer or from topographic maps used in conjunction with a map wheel or geographic information system (GIS) were significantly higher than WSS estimated in the field with a surveying level (biases of 34, 41, and 53%, respectively). Accuracy of WSS estimates obtained with an Abney level did not differ from surveying level estimates, but conclusions regarding the accuracy of Abney levels and clinometers were weakened by intratool variability. The surveying level estimated WSS most precisely (coefficient of variation [CV] = 0.26%), followed by the GIS (CV = 1.87%), map wheel (CV = 6.18%), Abney level (CV = 13.68%), and clinometer (CV = 21.57%). Estimates of WSS measured in the field with an Abney level and estimated for the same reaches with a GIS used in conjunction with 1:24,000-scale topographic maps were significantly correlated (r = 0.86), but there was a tendency for the GIS to overestimate WSS. Detailed accounts of the methods used to measure WSS and recommendations regarding the measurement of WSS are provided.
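The quantities compared above reduce to simple arithmetic: the slope itself is vertical drop over reach length, and the precision statistic is the coefficient of variation across repeated estimates. A minimal sketch (function names are illustrative):

```python
import statistics

def water_surface_slope(drop_m, reach_length_m):
    """Water-surface slope as vertical drop over reach length (m/m)."""
    return drop_m / reach_length_m

def percent_cv(estimates):
    """Coefficient of variation (%) across repeated slope estimates,
    the precision metric used to rank the five tools."""
    return 100.0 * statistics.stdev(estimates) / statistics.mean(estimates)
```

Computing `percent_cv` over the repeated measurements made with each tool on the same reach reproduces the CV ranking quoted in the abstract.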
Herpetological Monitoring Using a Pitfall Trapping Design in Southern California
Fisher, Robert; Stokes, Drew; Rochester, Carlton; Brehme, Cheryl; Hathaway, Stacie; Case, Ted
2008-01-01
The steps necessary to conduct a pitfall trapping survey for small terrestrial vertebrates are presented. Descriptions of the materials needed and the methods to build trapping equipment from raw materials are discussed. Recommended data collection techniques are given along with suggested data fields. Animal specimen processing procedures, including toe- and scale-clipping, are described for lizards, snakes, frogs, and salamanders. Methods are presented for conducting vegetation surveys that can be used to classify the environment associated with each pitfall trap array. Techniques for data storage and presentation are given based on commonly used computer applications. As with any study, much consideration should be given to the study design and methods before beginning any data collection effort.
Arienzo, Alyexandra; Sobze, Martin Sanou; Wadoum, Raoul Emeric Guetiya; Losito, Francesca; Colizzi, Vittorio; Antonini, Giovanni
2015-08-25
According to the World Health Organization (WHO) guidelines, "safe drinking-water must not represent any significant risk to health over a lifetime of consumption, including different sensitivities that may occur between life stages". Traditional methods of water analysis are usually complex, time consuming and require an appropriately equipped laboratory, specialized personnel and expensive instrumentation. The aim of this work was to apply an alternative method, the Micro Biological Survey (MBS), to analyse for contaminants in drinking water. Preliminary experiments were carried out to demonstrate the linearity and accuracy of the MBS method and to verify the possibility of using the evaluation of total coliforms in 1 mL of water as a sufficient parameter to roughly though accurately determine water microbiological quality. The MBS method was then tested "on field" to assess the microbiological quality of water sources in the city of Douala (Cameroon, Central Africa). Analyses were performed on both dug and drilled wells in different periods of the year. Results confirm that the MBS method appears to be a valid and accurate method to evaluate the microbiological quality of many water sources and it can be of valuable aid in developing countries.
A new probe of the magnetic field power spectrum in cosmic web filaments
NASA Astrophysics Data System (ADS)
Hales, Christopher A.; Greiner, Maksim; Ensslin, Torsten A.
2015-08-01
Establishing the properties of magnetic fields on scales larger than galaxy clusters is critical for resolving the unknown origin and evolution of galactic and cluster magnetism. More generally, observations of magnetic fields on cosmic scales are needed for assessing the impacts of magnetism on cosmology, particle physics, and structure formation over the full history of the Universe. However, firm observational evidence for magnetic fields in large-scale structure remains elusive. In an effort to address this problem, we have developed a novel statistical method to infer the magnetic field power spectrum in cosmic web filaments using observations of the two-point correlation of Faraday rotation measures from a dense grid of extragalactic radio sources. Here we describe our approach, which embeds and extends the pioneering work of Kolatt (1998) within the context of Information Field Theory (a statistical theory for Bayesian inference on spatially distributed signals; Enßlin et al., 2009). We describe prospects for observation, for example with forthcoming data from the ultra-deep JVLA CHILES Con Pol survey and future surveys with the SKA.
NASA Astrophysics Data System (ADS)
Wolf, C.; Johnson, A. S.; Bilicki, M.; Blake, C.; Amon, A.; Erben, T.; Glazebrook, K.; Heymans, C.; Hildebrandt, H.; Joudaki, S.; Klaes, D.; Kuijken, K.; Lidman, C.; Marin, F.; Parkinson, D.; Poole, G.
2017-04-01
We present a new training set for estimating empirical photometric redshifts of galaxies, which was created as part of the 2-degree Field Lensing Survey project. This training set is located in a ~700 deg2 area of the Kilo-Degree-Survey South field and is randomly selected and nearly complete at r < 19.5. We investigate the photometric redshift performance obtained with ugriz photometry from VST-ATLAS and W1/W2 from WISE, based on several empirical and template methods. The best redshift errors are obtained with kernel-density estimation (KDE), as are the lowest biases, which are consistent with zero within statistical noise. The 68th percentiles of the redshift scatter for magnitude-limited samples at r < (15.5, 17.5, 19.5) are (0.014, 0.017, 0.028). In this magnitude range, there are no known ambiguities in the colour-redshift map, consistent with a small rate of redshift outliers. In the fainter regime, the KDE method produces p(z) estimates per galaxy that represent unbiased and accurate redshift frequency expectations. The p(z) sum over any subsample is consistent with the true redshift frequency plus Poisson noise. Further improvements in redshift precision at r < 20 would mostly be expected from filter sets with narrower passbands to increase the sensitivity of colours to small changes in redshift.
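The kernel-density idea above, estimating a galaxy's redshift distribution p(z) from the redshifts of training-set galaxies with similar colours, can be sketched with SciPy's Gaussian KDE. This is a deliberately simplified stand-in (plain nearest neighbours in colour space), not the paper's estimator:

```python
import numpy as np
from scipy.stats import gaussian_kde

def pz_estimate(train_colours, train_z, target_colour, k=50):
    """Empirical p(z) for one galaxy: a Gaussian KDE over the redshifts
    of its k nearest neighbours in colour space.

    train_colours : (N, d) array of training-set colours
    train_z       : (N,) array of spectroscopic redshifts
    target_colour : (d,) colours of the galaxy to estimate
    Returns a callable density over redshift.
    """
    dist = np.linalg.norm(train_colours - target_colour, axis=1)
    neighbours = train_z[np.argsort(dist)[:k]]
    return gaussian_kde(neighbours)
```

Summing such per-galaxy densities over a subsample gives the redshift-frequency expectation whose consistency with the true N(z) the abstract reports.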
A geologic approach to field methods in fluvial geomorphology
Fitzpatrick, Faith A.; Thornbush, Mary J.; Allen, Casey D.
2014-01-01
A geologic approach to field methods in fluvial geomorphology is useful for understanding causes and consequences of past, present, and possible future perturbations in river behavior and floodplain dynamics. Field methods include characterizing river planform and morphology changes and floodplain sedimentary sequences over long periods of time along a longitudinal river continuum. Techniques include topographic and bathymetric surveying of fluvial landforms in valley bottoms and describing floodplain sedimentary sequences through coring, trenching, and examining pits and exposures. Historical sediment budgets that include floodplain sedimentary records can characterize past and present sources and sinks of sediment along a longitudinal river continuum. Describing paleochannels and floodplain vertical accretion deposits, estimating long-term sedimentation rates, and constructing historical sediment budgets can assist in management of aquatic resources, habitat, sedimentation, and flooding issues.
Automated face detection for occurrence and occupancy estimation in chimpanzees.
Crunchant, Anne-Sophie; Egerer, Monika; Loos, Alexander; Burghardt, Tilo; Zuberbühler, Klaus; Corogenes, Katherine; Leinert, Vera; Kulik, Lars; Kühl, Hjalmar S
2017-03-01
Surveying endangered species is necessary to evaluate conservation effectiveness. Camera trapping and biometric computer vision are recent technological advances that have changed the methods applicable to field surveys, and these methods have gained significant momentum over the last decade. Yet most researchers inspect footage manually, and few studies have used automated semantic processing of video trap data from the field. The aim of this study is to evaluate methods that incorporate automated face detection technology as an aid to estimating site use by two chimpanzee communities based on camera trapping. As a comparative baseline we employ traditional manual inspection of footage. Our analysis focuses on the basic parameter of occurrence, for which we assess the performance and practical value of chimpanzee face detection software. We found that semi-automated data processing required only 2-4% of the time of purely manual analysis, a non-negligible gain in efficiency that is critical when assessing the feasibility of camera-trap occupancy surveys. Our evaluations suggest that the methodology estimates the proportion of sites used relatively reliably. Chimpanzees are mostly detected when they are present and when videos are filmed in high resolution: the highest recall rate was 77%, at a false alarm rate of 2.8%, for videos containing only chimpanzee frontal face views. Our study is only a first step toward transferring face detection software from the lab to field application, but the results are promising and indicate that the current limitation of detecting chimpanzees in camera trap footage, the lack of suitable face views, can be overcome at the level of field data collection, that is, by the combined placement of multiple high-resolution cameras facing in opposite directions.
This will make it possible to routinely conduct chimpanzee occupancy surveys based on camera trapping and semi-automated processing of footage. Using semi-automated ape face detection to process camera trap footage requires only 2-4% of the time of manual analysis and allows site use by chimpanzees to be estimated relatively reliably. © 2017 Wiley Periodicals, Inc.
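The recall and false-alarm figures quoted above can be computed from per-video detection outcomes; a minimal sketch (the toy counts below are invented for the check, not the study's data):

```python
import numpy as np

def detection_metrics(labels, detections):
    """Recall and false-alarm rate for per-video face detections.
    labels: 1 if a chimpanzee is truly present in the video, else 0.
    detections: 1 if the detector fired on the video, else 0."""
    labels, detections = np.asarray(labels), np.asarray(detections)
    tp = np.sum((labels == 1) & (detections == 1))  # detected presences
    fn = np.sum((labels == 1) & (detections == 0))  # missed presences
    fp = np.sum((labels == 0) & (detections == 1))  # false alarms
    tn = np.sum((labels == 0) & (detections == 0))  # correct rejections
    recall = tp / (tp + fn)
    false_alarm = fp / (fp + tn)
    return recall, false_alarm

# Toy check: 3 of 4 occupied videos detected, 1 of 4 empty videos flagged
r, f = detection_metrics([1, 1, 1, 1, 0, 0, 0, 0], [1, 1, 1, 0, 1, 0, 0, 0])
```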
WEAK LENSING MEASUREMENT OF GALAXY CLUSTERS IN THE CFHTLS-WIDE SURVEY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shan Huanyuan; Tao Charling; Kneib, Jean-Paul
2012-03-20
We present the first weak gravitational lensing analysis of the completed Canada-France-Hawaii Telescope Legacy Survey (CFHTLS). We study the 64 deg² W1 field, the largest of the CFHTLS-Wide survey fields, and present the largest contiguous weak lensing convergence 'mass map' yet made. 2.66 million galaxy shapes are measured, using the Kaiser-Squires-Broadhurst (KSB) pipeline verified against high-resolution Hubble Space Telescope imaging that covers part of the CFHTLS. Our i'-band measurements are also consistent with an analysis of independent r'-band imaging. The reconstructed lensing convergence map contains 301 peaks with signal-to-noise ratio ν > 3.5, consistent with predictions of a ΛCDM model. Of these peaks, 126 lie within 3.0 arcmin of a brightest central galaxy identified from multicolour optical imaging in an independent, red-sequence survey. We also identify seven counterparts of massive clusters previously seen in X-ray emission within the 6 deg² XMM-LSS survey. With photometric redshift estimates for the source galaxies, we use a tomographic lensing method to fit the redshift and mass of each convergence peak. Matching these to the optical observations, we confirm 85 groups/clusters with χ²_reduced < 3.0, at a mean redshift ⟨z_c⟩ = 0.36 and velocity dispersion ⟨σ_c⟩ = 658.8 km s⁻¹. Future surveys, such as DES, LSST, KDUST, and EUCLID, will be able to apply these techniques to map clusters in much larger volumes and thus tightly constrain cosmological models.
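The peak-detection step (local maxima in a convergence signal-to-noise map above a threshold ν) can be sketched as follows. The 3×3 window and the toy map are illustrative assumptions, not the paper's KSB pipeline.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def find_peaks(snr_map, nu=3.5, size=3):
    """Return pixel coordinates of local maxima in a convergence
    signal-to-noise map that exceed the detection threshold nu."""
    local_max = snr_map == maximum_filter(snr_map, size=size)
    return np.argwhere(local_max & (snr_map > nu))

# Toy S/N map: flat background with one injected nu = 5 peak
snr = np.zeros((50, 50))
snr[20, 30] = 5.0
peaks = find_peaks(snr)
```

Each surviving peak would then be passed to the tomographic fit for its redshift and mass.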
[Galaxy/quasar classification based on nearest neighbor method].
Li, Xiang-Ru; Lu, Yu; Zhou, Jian-Ming; Wang, Yong-Jun
2011-09-01
With the wide application of high-quality CCDs in celestial spectrum imagery and the implementation of many large sky survey programs (e.g., the Sloan Digital Sky Survey (SDSS), the Two-degree-Field Galaxy Redshift Survey (2dF), the Spectroscopic Survey Telescope (SST), the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) program and the Large Synoptic Survey Telescope (LSST) program), celestial observational data are arriving like torrential rain. To utilize them effectively and fully, research on automated processing methods for celestial data is imperative. In the present work, we investigated how to recognize galaxies and quasars from their spectra using the nearest neighbor method. Galaxies and quasars are extragalactic objects; they are far from Earth, and their spectra are usually contaminated by various kinds of noise. Recognizing these two types of spectra is therefore a typical problem in automatic spectra classification. Furthermore, the nearest neighbor method is one of the most typical, classic, and mature algorithms in pattern recognition and data mining, and it is often used as a benchmark when developing novel algorithms. It is shown that the recognition ratio of the nearest neighbor (NN) method is comparable to the best results reported in the literature based on more complicated methods, and the advantage of NN is that it does not need to be trained, which is useful for incremental learning and parallel computation in processing massive spectral data. In conclusion, the results in this work are helpful for the study of galaxy and quasar spectra classification.
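A minimal sketch of 1-NN classification on synthetic spectra. The toy "galaxy" continua and "quasar" emission bump below are invented for illustration; real survey spectra, preprocessing, and distance metrics differ.

```python
import numpy as np

def nn_classify(train_spectra, train_labels, test_spectra):
    """1-nearest-neighbour classification: each test spectrum takes the
    label of the closest training spectrum (Euclidean distance).
    No training phase is needed, which suits incremental learning."""
    d = np.linalg.norm(test_spectra[:, None, :] - train_spectra[None, :, :], axis=2)
    return train_labels[d.argmin(axis=1)]

# Toy spectra: "galaxies" as smooth continua, "quasars" with a broad bump
wave = np.linspace(0, 1, 100)
rng = np.random.default_rng(1)
galaxies = np.array([1 - 0.5 * wave + rng.normal(0, 0.02, 100) for _ in range(20)])
quasars = np.array([1 + np.exp(-((wave - 0.5) / 0.05) ** 2) + rng.normal(0, 0.02, 100)
                    for _ in range(20)])
X = np.vstack([galaxies, quasars])
y = np.array(["galaxy"] * 20 + ["quasar"] * 20)
pred = nn_classify(X[:30], y[:30], X[30:])  # held-out spectra are all quasars
```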
Chapter A5. Processing of Water Samples
Wilde, Franceska D.; Radtke, Dean B.; Gibs, Jacob; Iwatsubo, Rick T.
1999-01-01
The National Field Manual for the Collection of Water-Quality Data (National Field Manual) describes protocols and provides guidelines for U.S. Geological Survey (USGS) personnel who collect data used to assess the quality of the Nation's surface-water and ground-water resources. This chapter addresses methods to be used in processing water samples to be analyzed for inorganic and organic chemical substances, including the bottling of composite, pumped, and bailed samples and subsamples; sample filtration; solid-phase extraction for pesticide analyses; sample preservation; and sample handling and shipping. Each chapter of the National Field Manual is published separately and revised periodically. Newly published and revised chapters will be announced on the USGS Home Page on the World Wide Web under 'New Publications of the U.S. Geological Survey.' The URL for this page is http://water.usgs.gov/lookup/get?newpubs.
The U.S. Geological Survey Astrogeology Science Center
Kestay, Laszlo P.; Vaughan, R. Greg; Gaddis, Lisa R.; Herkenhoff, Kenneth E.; Hagerty, Justin J.
2017-07-17
In 1960, Eugene Shoemaker and a small team of other scientists founded the field of astrogeology to develop tools and methods for astronauts studying the geology of the Moon and other planetary bodies. Subsequently, in 1962, the U.S. Geological Survey Branch of Astrogeology was established in Menlo Park, California. In 1963, the Branch moved to Flagstaff, Arizona, to be closer to the young lava flows of the San Francisco Volcanic Field and Meteor Crater, the best preserved impact crater in the world. These geologic features of northern Arizona were considered good analogs for the Moon and other planetary bodies and valuable for geologic studies and astronaut field training. From its Flagstaff campus, the USGS has supported the National Aeronautics and Space Administration (NASA) space program with scientific and cartographic expertise for more than 50 years.
Kimiafar, Khalil; Sarbaz, Masoumeh; Sheikhtaheri, Abbas
2016-01-01
Background: There are no general strategies or tools to evaluate daily lesson plans; however, assessments conducted using traditional methods usually include course plans. This study aimed to evaluate the strengths and weaknesses of online survey software in collecting data on education in medical fields and the application of such software to evaluate students' views and modify lesson plans. Methods: After investigating the available online survey software, esurveypro was selected for assessing daily lesson plans. After using the software for one semester, a questionnaire was prepared to assess the advantages and disadvantages of this method and students' views in a cross-sectional study. Results: The majority of the students (51.7%) rated the evaluation of classes per session (lesson plans) using the online survey as useful or very useful. About 51% (n=36) of the students considered this method effective in improving the management of each session, 67.1% (n=47) considered it effective in improving the management of sessions for the next semester, and 51.4% (n=36) said it had a high impact on improving the educational content of subsequent sessions. Finally, 61.4% (n=43) of the students expressed high or very high satisfaction with using an online survey at each session. Conclusion: The use of online surveys may be appropriate to improve lesson plans and educational planning at different levels. This method can be used for other evaluations and for assessing people's opinions at different levels of an educational system.
Percolation Analysis of a Wiener Reconstruction of the IRAS 1.2 Jy Redshift Catalog
NASA Astrophysics Data System (ADS)
Yess, Capp; Shandarin, Sergei F.; Fisher, Karl B.
1997-01-01
We present percolation analyses of Wiener reconstructions of the IRAS 1.2 Jy redshift survey. There are 10 reconstructions of galaxy density fields in real space spanning the range β = 0.1-1.0, where β = Ω^0.6/b, Ω is the present dimensionless density, and b is the bias factor. Our method uses the growth of the largest cluster statistic to characterize the topology of a density field, where Gaussian randomized versions of the reconstructions are used as standards for analysis. For the reconstruction volume of radius R ~ 100 h⁻¹ Mpc, percolation analysis reveals a slight "meatball" topology for the real-space galaxy distribution of the IRAS survey.
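The largest-cluster statistic can be sketched as follows: threshold the density field, label connected regions, and track the fraction of over-threshold cells in the largest region. Uncorrelated white noise stands in for the reconstructed field here, and the connectivity and thresholds are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import label

def largest_cluster_fraction(density, threshold):
    """Fraction of over-threshold cells belonging to the largest connected
    cluster; this statistic grows sharply as the field percolates."""
    mask = density > threshold
    labels, n = label(mask)            # 6-connected components in 3D
    if n == 0:
        return 0.0
    sizes = np.bincount(labels.ravel())[1:]  # skip background label 0
    return sizes.max() / mask.sum()

# Toy density field: uncorrelated Gaussian noise on a 64^3 grid
rng = np.random.default_rng(2)
field = rng.normal(size=(64, 64, 64))
low = largest_cluster_fraction(field, 2.0)    # high threshold: tiny clusters
high = largest_cluster_fraction(field, -2.0)  # low threshold: spanning cluster
```

Sweeping the threshold and comparing against Gaussianized versions of the field is what distinguishes "meatball" from other topologies.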
Mach 6 flow field surveys beneath the forebody of an airbreathing missile
NASA Technical Reports Server (NTRS)
Johnson, P. J.; Hunt, J. L.
1986-01-01
Wall static, local stream static, and pitot pressure surveys were made on the windward side of a hypersonic airbreathing missile at full-scale length Reynolds numbers. In the inviscid part of the flow field, the experimental mass-flow ratios agreed with trends predicted by a three-dimensional method-of-characteristics solution. At a longitudinal station 3.5 diameters downstream of the nose, the boundary layer was transitional or turbulent at zero incidence but became laminar as the angle of attack increased. The bell-shaped distribution of the boundary layer across the width of the body affected the mass flow distribution out to the bow shock and decreased the mass flow available to the engine inlet.
Development of the Stress of Immigration Survey (SOIS): a Field Test among Mexican Immigrant Women
Sternberg, Rosa Maria; Nápoles, Anna Maria; Gregorich, Steven; Paul, Steven; Lee, Kathryn A.; Stewart, Anita L.
2016-01-01
The Stress of Immigration Survey (SOIS) is a screening tool used to assess immigration-related stress. The mixed methods approach included concept development, pretesting, field-testing, and psychometric evaluation in a sample of 131 low-income women of Mexican descent. The 21-item SOIS screens for stress related to language; immigrant status; work issues; yearning for family and home country; and cultural dissonance. Mean scores ranged from 3.6 to 4.4 (1-5 scale, higher is more stress). Cronbach's alphas >.80 for all sub-scales. The SOIS may be a useful screening tool for detecting high levels of immigration-related stress in low-income Mexican immigrant women. PMID:26605954
Hello World! - Experiencing Usability Methods without Usability Expertise
NASA Astrophysics Data System (ADS)
Eriksson, Elina; Cajander, Åsa; Gulliksen, Jan
How do you do usability work when no usability expertise is available? What happens in an organization when system developers with no previous HCI knowledge start applying usability methods, particularly field studies, after a 3-day course? In order to answer these questions, qualitative data were gathered through participatory observations, a feedback survey, field study documentation and interviews with 47 system developers from a public authority. Our results suggest that field studies enhance developers' understanding of the user perspective and provide a more holistic overview of the use situation, but that some developers were unable to interpret their observations and see solutions to the users' problems. The field study method was much appreciated and has now become standard operating procedure within the organization. However, although field studies may be useful, they do not replace the need for usability professionals, whose knowledge is essential for more complex observations and analysis and for keeping the focus on usability.
NASA Astrophysics Data System (ADS)
Furukawa, T.; Takizawa, K.; Yano, K.; Kuwahara, D.; Shinohara, S.
2018-04-01
A two-dimensional scanning probe instrument has been developed to survey spatial plasma characteristics in our electrodeless plasma acceleration schemes. In particular, diagnostics of plasma parameters, e.g., plasma density, temperature, velocity, and excited magnetic field, are essential for elucidating the physical phenomena, since we have been concentrating on next-generation plasma propulsion methods, e.g., the Rotating Magnetic Field plasma acceleration method, by characterizing the plasma performance. Moreover, in order to estimate the thrust performance in our experimental scheme, we have also mounted a target-type thrust stand on this movable instrument and scanned the axial profile of the thrust performance in the presence of the external magnetic field generated by permanent magnets, so as to investigate the plasma captured in the stand area, considering the divergent field lines in the downstream region of the generation antenna. In this paper, we introduce this novel measurement instrument and describe how these parameters are measured.
Field Tests of the Magnetotelluric Method to Detect Gas Hydrates, Mallik, Mackenzie Delta, Canada
NASA Astrophysics Data System (ADS)
Craven, J. A.; Roberts, B.; Bellefleur, G.; Spratt, J.; Wright, F.; Dallimore, S. R.
2008-12-01
The magnetotelluric method is not generally utilized at extreme latitudes, primarily because of difficulties in making the good electrical contact with the ground that is required to measure the electric field. As such, the magnetotelluric technique has not previously been investigated as a means to directly detect gas hydrates in onshore permafrost environments. We present the results of preliminary field tests at Mallik, Northwest Territories, Canada, which demonstrate that good-quality magnetotelluric data can be obtained in this environment using specialized electrodes and buffer amplifiers similar to those utilized by Wannamaker et al. (2004). This result suggests that subsurface images from larger magnetotelluric surveys will usefully complement other techniques to detect, quantify and characterize gas hydrates.
Myatt, Mark; Limburg, Hans; Minassian, Darwin; Katyola, Damson
2003-01-01
OBJECTIVE: To test the applicability of lot quality assurance sampling (LQAS) for the rapid assessment of the prevalence of active trachoma. METHODS: Prevalence of active trachoma in six communities was found by examining all children aged 2-5 years. Trial surveys were conducted in these communities. A sampling plan appropriate for classifying communities with prevalences < or =20% and > or =40% was applied to the survey data. Operating characteristic and average sample number curves were plotted, and screening test indices were calculated. The ability of LQAS to provide a three-class classification system was investigated. FINDINGS: Ninety-six trial surveys were conducted. All communities with prevalences < or =20% and > or =40% were identified correctly. The method discriminated between communities with prevalences < or =30% and >30%, with sensitivity of 98% (95% confidence interval (CI)=88.2-99.9%), specificity of 84.4% (CI=69.9-93.0%), positive predictive value of 87.7% (CI=75.7-94.5%), negative predictive value of 97.4% (CI=84.9-99.9%), and accuracy of 91.7% (CI=83.8-96.1%). Agreement between the three prevalence classes and survey classifications was 84.4% (CI=75.2-90.7%). The time needed to complete the surveys was consistent with the need to complete a survey in one day. CONCLUSION: Lot quality assurance sampling provides a method of classifying communities according to the prevalence of active trachoma. It merits serious consideration as a replacement for the assessment of the prevalence of active trachoma with the currently used trachoma rapid assessment method. It may be extended to provide a multi-class classification method. PMID:14997240
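The operating characteristic curve of an LQAS plan follows directly from the binomial distribution: the probability of classifying a community as low prevalence is the probability of seeing at most d cases among n sampled children. A sketch, where the plan parameters n and d are illustrative assumptions rather than the study's actual sampling plan:

```python
from scipy.stats import binom

def oc_curve(n, d, prevalences):
    """Operating characteristic of an LQAS plan: probability of classifying
    a community as low prevalence (<= d cases among n sampled children)
    as a function of the true prevalence p."""
    return [binom.cdf(d, n, p) for p in prevalences]

# Illustrative plan: accept "low prevalence" if <= 9 cases among 30 children
probs = oc_curve(30, 9, [0.10, 0.20, 0.40])
```

Plotting such curves over a grid of prevalences is how the classification thresholds (here <=20% vs >=40%) are tuned before fieldwork.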
NASA Astrophysics Data System (ADS)
Li, D. Y.; Li, K.; Wu, C.
2017-08-01
With the increasingly fine degree of heritage building surveying and mapping, building information modelling (BIM) technology has begun to be used in the surveying and mapping, renovation, recording and research of heritage buildings, a practice called historic building information modelling (HBIM). The hierarchical framework of BIM's parametric component library, in which components of the same type share the same parameters, has the same internal logic as archaeological typology, which is increasingly popular for dating ancient buildings. Compared with common materials such as 2D drawings and photos, typology with HBIM has two advantages: (1) comprehensive building information in both collection and representation, and (2) uniform and reasonable classification criteria. This paper takes the information surveying and mapping of Jiayuguan Fortress Town as an example to introduce a field work method for information surveying and mapping based on HBIM technology and the construction of a Revit family library. Then, to demonstrate the feasibility and advantages of HBIM technology in the typology method, the paper dates the Guanghua gate tower, Rouyuan gate tower, Wenchang pavilion and the theater building of Jiayuguan Fortress Town using HBIM technology and the typology method.
Surveying vendors of street-vended food: a new methodology applied in two Guatemalan cities.
Mahon, B. E.; Sobel, J.; Townes, J. M.; Mendoza, C.; Gudiel Lemus, M.; Cano, F.; Tauxe, R. V.
1999-01-01
Lack of reliable data about street vendors, who are difficult to survey, has hampered efforts to improve the safety of street-vended food. A two-phase method for sampling vendors, surveying first in areas of concentrated vending activity identified by local authorities and second in randomly selected areas, was developed and implemented in two Guatemalan cities where street-vended food had been implicated in cholera transmission. In a 4-day survey in Escuintla, 59 vendors (42 from phase 1, 17 from phase 2) were interviewed. They demonstrated good knowledge of food safety and cholera but unsafe practices, implying that more effective, practical training was needed. In a 6-day survey in Guatemala City, 78 vendors (77 from phase 1, 1 from phase 2) were interviewed. Sixty-eight (87%) vendors stored water, usually in wide-mouthed vessels prone to contamination; this led to a field test of a new system for safe water storage. Useful information for public health planning and intervention can be gathered rapidly with this new method for surveying street vendors. PMID:10459643
Two field trials for deblending of simultaneous source surveys: Why we failed and why we succeeded?
NASA Astrophysics Data System (ADS)
Zu, Shaohuan; Zhou, Hui; Chen, Haolin; Zheng, Hao; Chen, Yangkang
2017-08-01
Currently, deblending is the main strategy for dealing with the intense interference problem of simultaneous source data. Most deblending methods rely on the property that the useful signal is coherent while the interference is incoherent in some domain other than the common shot domain. In this paper, two simultaneous source field trials are studied in detail. In the first trial, the simultaneous source survey was not optimal: the dithering code had strong coherency and the minimum distance between the two vessels was small. The chosen marine shot scheduling and vessel deployment made it difficult to deblend the simultaneous source data, and the result was an unexpected failure. Next, we tested different parameters of the simultaneous source survey (the dithering code and the minimum distance between vessels) using simulated blended data and gained some useful insights. We then carried out the second field trial with a carefully designed survey that differed substantially from the first. The deblended results in the common receiver gather, the common shot gather and the final stacked profile were encouraging. The second field trial was a complete success, which gives us confidence for further tests (such as a full three-dimensional acquisition test or a high-resolution acquisition test with denser spatial sampling). Since failures with simultaneous sourcing are seldom reported, our contribution in this paper is a detailed discussion of both our failed and our successful field experiments and the lessons we have learned from them, in the hope that this experience will be useful to other researchers in the field.
New methods to constrain the radio transient rate: results from a survey of four fields with LOFAR.
Carbone, D; van der Horst, A J; Wijers, R A M J; Swinbank, J D; Rowlinson, A; Broderick, J W; Cendes, Y N; Stewart, A J; Bell, M E; Breton, R P; Corbel, S; Eislöffel, J; Fender, R P; Grießmeier, J-M; Hessels, J W T; Jonker, P; Kramer, M; Law, C J; Miller-Jones, J C A; Pietka, M; Scheers, L H A; Stappers, B W; van Leeuwen, J; Wijnands, R; Wise, M; Zarka, P
2016-07-01
We report on the results of a search for radio transients between 115 and 190 MHz with the LOw-Frequency ARray (LOFAR). Four fields have been monitored with cadences between 15 min and several months. A total of 151 images were obtained, giving a total survey area of 2275 deg². We analysed our data using standard LOFAR tools and searched for radio transients using the LOFAR Transients Pipeline. No credible radio transient candidate has been detected; however, we are able to set upper limits on the surface density of radio transient sources at low radio frequencies. We also show that low-frequency radio surveys are more sensitive to steep-spectrum coherent transient sources than GHz radio surveys. We used two new statistical methods to determine the upper limits on the transient surface density. One is free of assumptions on the flux distribution of the sources, while the other assumes a power-law distribution in flux and sets more stringent constraints on the transient surface density. Both of these methods provide better constraints than the approach used in previous works. The best value for the upper limit we can set for the transient surface density, using the method assuming a power-law flux distribution, is 1.3 × 10⁻³ deg⁻² for transients brighter than 0.3 Jy with a time-scale of 15 min, at a frequency of 150 MHz. We also calculated for the first time upper limits for the transient surface density for transients of different time-scales. We find that the results can differ by orders of magnitude from previously reported, simplified estimates.
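The zero-detection case can be sketched with simple Poisson reasoning: with no candidates found, the expected number of transients μ over the whole survey satisfies exp(-μ) = 1 - C at confidence C, and μ is spread over the total snapshot area. This is a simplified estimate, not the paper's exact estimators; the per-image area below is back-computed from the totals quoted above.

```python
import math

def transient_density_limit(n_images, area_per_image, confidence=0.95):
    """Upper limit on the transient surface density (per deg^2) after a
    survey with zero detections, from Poisson statistics: exp(-mu) = 1 - C,
    with the rate mu spread over the total imaged area."""
    mu = -math.log(1.0 - confidence)        # ~3.0 expected events at 95%
    total_area = n_images * area_per_image  # total snapshot area in deg^2
    return mu / total_area

# e.g. 151 images of ~15.07 deg^2 each (~2275 deg^2 total, as in the survey)
limit = transient_density_limit(151, 15.07)
```

The paper's flux-distribution-aware method tightens this kind of limit further by weighting images by their individual sensitivities.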
VizieR Online Data Catalog: 12um ISOCAM survey of the ESO-Sculptor field (Seymour+, 2007)
NASA Astrophysics Data System (ADS)
Seymour, N.; Rocca-Volmerange, B.; de Lapparent, V.
2007-11-01
We present a detailed reduction of a mid-infrared 12um (LW10 filter) ISOCAM open time observation performed on the ESO-Sculptor Survey field (Arnouts et al., 1997A&AS..124..163A). A complete catalogue of 142 sources (120 galaxies and 22 stars), detected with high significance (equivalent to 5σ), is presented above an integrated flux density of 0.31 mJy. Star/galaxy separation is performed by a detailed study of colour-colour diagrams. The catalogue is complete to 1 mJy and, below this flux density, the incompleteness is corrected using two independent methods. The first method uses stars and the second uses optical counterparts of the ISOCAM galaxies; these methods yield consistent results. We also apply an empirical flux density calibration using stars in the field. For each star, the 12um flux density is derived by fitting optical colours from a multi-band χ² to stellar templates (BaSeL-2.0) and using empirical optical-IR colour-colour relations. This article is a companion analysis to our 2007 paper (Rocca-Volmerange et al. 2007A&A...475..801R) where the 12um faint galaxy counts are presented and analysed per galaxy type with the evolutionary code PEGASE.3. (1 data file).
Bioinspired Intelligent Algorithm and Its Applications for Mobile Robot Control: A Survey.
Ni, Jianjun; Wu, Liuying; Fan, Xinnan; Yang, Simon X
2016-01-01
Bioinspired intelligent algorithms (BIAs) are a kind of intelligent computing method with a more lifelike biological working mechanism than other types. BIAs have made significant progress both in the understanding of neuroscience and biological systems and in application to various fields. Mobile robot control is one of the main application fields of BIAs and has attracted more and more attention, because mobile robots can be used widely and because general artificial intelligence algorithms meet a development bottleneck in this field, owing to complex computing and the dependence on high-precision sensors. This paper presents a survey of recent research in BIAs, focusing on the realization of various BIAs based on different working mechanisms and on applications to mobile robot control, to help in understanding BIAs comprehensively and clearly. The survey has four primary parts: a classification of BIAs by biomimetic mechanism, a summary of several typical BIAs at different levels, an overview of current applications of BIAs in mobile robot control, and a description of some possible future directions for research.
Fisher, Michael B.; Mann, Benjamin H.; Cronk, Ryan D.; Shields, Katherine F.; Klug, Tori L.; Ramaswamy, Rohit
2016-01-01
Information and communications technologies (ICTs) such as mobile survey tools (MSTs) can facilitate field-level data collection to drive improvements in national and international development programs. MSTs allow users to gather and transmit field data in real time, standardize data storage and management, automate routine analyses, and visualize data. Dozens of diverse MST options are available, and users may struggle to select suitable options. We developed a systematic MST Evaluation Framework (EF), based on International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) software quality modeling standards, to objectively assess MSTs and assist program implementers in identifying suitable MST options. The EF is applicable to MSTs for a broad variety of applications. We also conducted an MST user survey to elucidate needs and priorities of current MST users. Finally, the EF was used to assess seven MSTs currently used for water and sanitation monitoring, as a validation exercise. The results suggest that the EF is a promising method for evaluating MSTs. PMID:27563916
NASA Astrophysics Data System (ADS)
Gålfalk, Magnus; Karlson, Martin; Crill, Patrick; Bastviken, David
2017-04-01
The calibration and validation of remote sensing land cover products is highly dependent on accurate ground truth data, which are costly and practically challenging to collect. This study evaluates a novel and efficient alternative to the field surveys and UAV imaging commonly applied for this task. The method consists of (i) a lightweight, waterproof, remote-controlled RGB camera, mounted on an extendable monopod, used to acquire wide-field images of the ground from a height of 4.5 m, and (ii) a script for semi-automatic image classification. In post-processing, the wide-field images are corrected for optical distortion and geometrically rectified so that the spatial resolution is the same over the surface area used for classification. The script distinguishes land surface components by colour, brightness and spatial variability. The method was evaluated in wetland areas around Abisko, northern Sweden. Proportional estimates of the six main surface components of the wetlands (wet and dry Sphagnum, shrub, grass, water, rock) were derived for 200 images, equivalent to 10 × 10 m field plots. These photo plots were then used as calibration data for a regional-scale satellite-based classification that separates the six wetland surface components using a Sentinel-1 time series. The method presented in this study is accurate, rapid, robust and cost-efficient compared with field surveys (which are time consuming) and drone mapping (which requires low wind speeds and no rain, suffers from battery-limited flight times, is subject to GPS/compass errors at high latitudes, and is prohibited by law in some areas).
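A toy version of such rule-based colour/brightness classification on a rectified plot image, followed by proportional cover estimation. The class names and thresholds below are invented placeholders, not the published script's actual rules or the six wetland classes.

```python
import numpy as np

def classify_surface(rgb):
    """Assign each pixel of a rectified plot image to a surface class
    using simple colour and brightness rules (illustrative thresholds)."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    brightness = (r + g + b) / 3.0
    classes = np.full(rgb.shape[:2], "rock", dtype=object)  # default class
    classes[(b > r) & (b > g)] = "water"       # blue-dominated pixels
    classes[(g > r) & (g > b)] = "vegetation"  # green-dominated pixels
    classes[brightness < 40] = "shadow"        # dark pixels override colour
    return classes

# Proportional cover estimates from a toy 2x2 image
img = np.array([[[10, 200, 30], [20, 40, 220]],
                [[120, 120, 120], [5, 10, 8]]], dtype=np.uint8)
cover = {c: float(np.mean(classify_surface(img) == c))
         for c in ("vegetation", "water", "rock", "shadow")}
```

Per-plot cover fractions computed this way are what would feed the satellite classification as calibration data.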
NASA Astrophysics Data System (ADS)
Callahan, R. P.; Taylor, N. J.; Pasquet, S.; Dueker, K. G.; Riebe, C. S.; Holbrook, W. S.
2016-12-01
Geophysical imaging is rapidly becoming popular for quantifying subsurface critical zone (CZ) architecture. However, a diverse array of measurements and measurement techniques are available, raising the question of which are appropriate for specific study goals. Here we compare two techniques for measuring S-wave velocities (Vs) in the near surface. The first approach quantifies Vs in three dimensions using a passive source and an iterative residual least-squares tomographic inversion. The second approach uses a more traditional active-source seismic survey to quantify Vs in two dimensions via a Monte Carlo surface-wave dispersion inversion. Our analysis focuses on three 0.01 km² study plots on weathered granitic bedrock in the Southern Sierra Critical Zone Observatory. Preliminary results indicate that depth-averaged velocities from the two methods agree over the scales of resolution of the techniques. While the passive- and active-source techniques both quantify Vs, each method has distinct advantages and disadvantages during data acquisition and analysis. The passive-source method has the advantage of generating a three-dimensional distribution of subsurface Vs structure across a broad area. Because this method relies on the ambient seismic field as a source, which varies unpredictably across space and time, data quality and depth of investigation are outside the control of the user. Meanwhile, traditional active-source surveys can be designed around a desired depth of investigation. However, they only generate a two-dimensional image of Vs structure. Whereas traditional active-source surveys can be inverted quickly on a personal computer in the field, passive-source surveys require significantly more computation and are best conducted in a high-performance computing environment. We use data from our study sites to compare these methods across different scales and to explore how these methods can be used to better understand subsurface CZ architecture.
Wampler, Peter J; Rediske, Richard R; Molla, Azizur R
2013-01-18
A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes were initially mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed, and a further 16.4% of the homes visited were unoccupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge required to identify and locate households.
This method provides an important technique that can be applied to other developing countries where a randomized study design is needed but infrastructure is lacking to implement more traditional participant selection methods.
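The random-selection step can be mirrored in a few lines of code; the sketch below reproduces the Excel draw of 96 homes from 537 mapped candidates, with hypothetical home IDs and waypoint names:

```python
import random

def select_households(mapped_ids, n_sample, seed=42):
    """Simple random sample of mapped household IDs (the study used
    Microsoft Excel for this step); each selected home is paired with
    a waypoint label for the handheld GPS. Naming is hypothetical."""
    rng = random.Random(seed)            # fixed seed -> reproducible list
    chosen = rng.sample(mapped_ids, n_sample)
    return {f"WP{i:03d}": hid for i, hid in enumerate(chosen, start=1)}

homes = [f"H{k:03d}" for k in range(1, 538)]   # 537 mapped homes
survey = select_households(homes, 96)          # 96 potential survey sites
```

The resulting waypoint-to-home mapping is what would be loaded onto the GPS units for field location.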
Field 1: A First Look at the KELT RR Lyrae Project
NASA Astrophysics Data System (ADS)
De Lee, Nathan M.; Kinemuchi, Karen; Pepper, Joshua; Rodriguez, Joseph E.; Paegert, Martin
2015-01-01
In this poster we will discuss our ongoing program to use extant light curves from the Kilodegree Extremely Little Telescope (KELT) survey to find and characterize RR Lyrae (RRL) stars in the disk and inner halo of the Milky Way. We will focus on initial results from our testbed region, Field 1. RRL stars are of particular interest because they are standard candles and can be used to map out structure in the galaxy. The periods and shapes of RRL light curves also contain information about their Oosterhoff type, which can probe galactic formation history, and their metallicity, respectively. Although there have been several large photometric surveys for RR Lyrae in the nearby galaxy (OGLE, NSVS, ASAS, and MACHO, to name a few), they have each been limited in either sky coverage or number of epochs. The KELT survey represents a new generation of surveys with many epochs over a large portion of the sky. KELT samples 60% of the sky in both the northern and southern hemispheres, and has a long time baseline of 4-8 years with a very high cadence of less than 20 minutes. This translates into 4,000 to 9,000 epochs per light curve, with completeness out to 3 kpc from the Sun. Recent results from both Kepler and ground-based surveys suggest that as many as 50% of RR Lyrae stars show long-term modulation of their light curve shapes (the Blazhko effect). These stars, combined with RRL stars that pulsate in more than one mode, give a sample of objects that the KELT survey is uniquely suited to explore. This poster uses the RR Lyrae stars in Field 1 of the KELT survey to compare detection methods with previous variable star surveys of the same region. We also discuss the individual RR Lyrae found in Field 1. In particular, we focus on initial characterization of RRL light curves, including those with amplitude-modulated or period-modulated light curves. We use these initial results to discuss future plans for this survey.
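Characterizing RRL candidates from such dense light curves rests on phase-folding at trial periods; a minimal phase-dispersion sketch on synthetic data (the period grid, cadence, and light-curve model below are assumptions, not KELT pipeline details):

```python
import numpy as np

def phase_dispersion(times, mags, period, nbins=10):
    """PDM-style statistic: mean within-bin variance of the light curve
    folded at a trial period; the true period yields a smooth folded
    curve and hence a low score."""
    phase = (times / period) % 1.0
    bins = np.minimum((phase * nbins).astype(int), nbins - 1)
    return np.mean([mags[bins == b].var()
                    for b in range(nbins) if (bins == b).any()])

# Synthetic RRL-like curve: 0.55 d period, ~2000 epochs over 200 days.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 200.0, 2000))
m = 15.0 + 0.4 * np.sin(2 * np.pi * t / 0.55) + rng.normal(0, 0.02, t.size)
trials = np.linspace(0.3, 0.9, 601)            # 0.001 d period grid
best = trials[np.argmin([phase_dispersion(t, m, p) for p in trials])]
```

A sinusoid stands in for the sawtooth RRL pulse shape here; with thousands of epochs the statistic sharpens strongly around the true period.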
A field test of three LQAS designs to assess the prevalence of acute malnutrition.
Deitchler, Megan; Valadez, Joseph J; Egge, Kari; Fernandez, Soledad; Hennigan, Mary
2007-08-01
The conventional method for assessing the prevalence of Global Acute Malnutrition (GAM) in emergency settings is the 30 x 30 cluster-survey. This study describes alternative approaches: three Lot Quality Assurance Sampling (LQAS) designs to assess GAM. The LQAS designs were field-tested and their results compared with those from a 30 x 30 cluster-survey. Computer simulations confirmed that small clusters instead of a simple random sample could be used for LQAS assessments of GAM. Three LQAS designs were developed (33 x 6, 67 x 3, and Sequential) to assess GAM thresholds of 10, 15 and 20%. The designs were field-tested simultaneously with a 30 x 30 cluster-survey in Siraro, Ethiopia during June 2003. Using a nested study design, anthropometric, morbidity and vaccination data were collected on all children 6-59 months in sampled households. Hypothesis tests about GAM thresholds were conducted for each LQAS design. Point estimates were obtained for the 30 x 30 cluster-survey and the 33 x 6 and 67 x 3 LQAS designs. Hypothesis tests showed GAM as <10% for the 33 x 6 design and GAM as ≥10% for the 67 x 3 and Sequential designs. Point estimates for the 33 x 6 and 67 x 3 designs were similar to those of the 30 x 30 cluster-survey for GAM (6.7%, CI = 3.2-10.2%; 8.2%, CI = 4.3-12.1%; 7.4%, CI = 4.8-9.9%) and all other indicators. The CIs for the LQAS designs were only slightly wider than the CIs for the 30 x 30 cluster-survey; yet the LQAS designs required substantially less time to administer. The LQAS designs provide statistically appropriate alternatives to the more time-consuming 30 x 30 cluster-survey. However, additional field-testing is needed using independent samples rather than a nested study design.
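Each LQAS hypothesis test reduces to a binomial decision rule: sample n children and flag the area if more than d are acutely malnourished. A pure-Python sketch of the resulting error rates (the n and d values below are illustrative and ignore the cluster design effect):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_error_rates(n, d, p_low, p_high):
    """Operating characteristics of the rule 'flag the area when more
    than d of n sampled children are malnourished':
    alpha = P(flag | true prevalence p_low)     (false alarm),
    beta  = P(no flag | true prevalence p_high) (missed high GAM)."""
    alpha = 1.0 - binom_cdf(d, n, p_low)
    beta = binom_cdf(d, n, p_high)
    return alpha, beta

# Illustrative: n = 198 children (a 33 x 6 design), decision value 14,
# testing a 10% GAM threshold against a 5% baseline; the clustered
# sampling would widen these error rates in practice.
alpha, beta = lqas_error_rates(198, 14, 0.05, 0.10)
```

Tuning d against target alpha and beta for each threshold is exactly the design exercise behind the 33 x 6, 67 x 3, and Sequential variants.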
Variability-selected active galactic nuclei in the VST-SUDARE/VOICE survey of the COSMOS field
NASA Astrophysics Data System (ADS)
De Cicco, D.; Paolillo, M.; Covone, G.; Falocco, S.; Longo, G.; Grado, A.; Limatola, L.; Botticella, M. T.; Pignata, G.; Cappellaro, E.; Vaccari, M.; Trevese, D.; Vagnetti, F.; Salvato, M.; Radovich, M.; Brandt, W. N.; Capaccioli, M.; Napolitano, N. R.; Schipani, P.
2015-02-01
Context. Active galaxies are characterized by variability at every wavelength, with timescales from hours to years depending on the observing window. Optical variability has proven to be an effective way of detecting AGNs in imaging surveys lasting from weeks to years. Aims: In the present work we test the use of optical variability as a tool to identify active galactic nuclei in the VST multiepoch survey of the COSMOS field, originally tailored to detect supernova events. Methods: We make use of the multiwavelength data provided by other COSMOS surveys to discuss the reliability of the method and the nature of our AGN candidates. Results: The selection on the basis of optical variability returns a sample of 83 AGN candidates; based on a number of diagnostics, we conclude that 67 of them are confirmed AGNs (81% purity), 12 are classified as supernovae, while the nature of the remaining 4 is unknown. For the subsample of AGNs with some spectroscopic classification, we find that Type 1 are prevalent (89%) compared to Type 2 AGNs (11%). Overall, our approach is able to retrieve on average 15% of all AGNs in the field identified by means of spectroscopic or X-ray classification, with a strong dependence on the source apparent magnitude (completeness ranging from 26% to 5%). In particular, the completeness for Type 1 AGNs is 25%, while it drops to 6% for Type 2 AGNs. The rest of the X-ray selected AGN population presents on average a larger rms variability than the bulk of non-variable sources, indicating that variability detection for at least some of these objects is prevented only by the photometric accuracy of the data. The low completeness is in part due to the short observing span: we show that increasing the temporal baseline results in larger samples as expected for sources with a red-noise power spectrum. Our results allow us to assess the usefulness of this AGN selection technique in view of future wide-field surveys.
Observations were provided by the ESO programs 088.D-0370 and 088.D-4013 (PI G. Pignata). Table 3 is available in electronic form at http://www.aanda.org
1980-03-01
[Front-matter fragments: table of contents (Expectations; Reconnaissance Survey: Field Methods; Construction) and figure list (7. Remains of a Japanese Tugboat (Object 1) Aground Near the Small Boat Harbor Entrance; 8. The Remains of a Possible...). Abstract begins: Members of the Pacific Studies Institute and Maritime Historian, Mr. Ronald Strong, of ...]
Joint 3D Inversion of ZTEM Airborne and Ground MT Data with Application to Geothermal Exploration
NASA Astrophysics Data System (ADS)
Wannamaker, P. E.; Maris, V.; Kordy, M. A.
2017-12-01
ZTEM is an airborne electromagnetic (EM) geophysical technique developed by Geotech Inc® in which naturally propagating EM fields originating from regional and global lightning discharges (sferics) are measured as a means of inferring subsurface electrical resistivity structure. A helicopter-borne coil platform (bird) measuring the vertical component of magnetic (H) field variations along a flown profile is referenced to a pair of horizontal coils at a fixed location on the ground in order to estimate a tensor H-field transfer function. The ZTEM method is distinct from the traditional magnetotelluric (MT) method in that the electric (E) fields are not considered, because of the technological challenge of measuring E-fields in the dielectric air medium. This can lend some non-uniqueness to ZTEM interpretation, because a range of conductivity structures in the earth, depending upon an assumed background earth resistivity model, can fit ZTEM data to within tolerance. MT data do not suffer this particular problem, but they are cumbersome to acquire, commonly requiring land-based transport, often in near-roadless areas, and the laying out and burying of the electrodes and H coils. The complementary nature of ZTEM and MT logistics and resolution has motivated development of schemes to acquire appropriate amounts of each data type in a single survey and to produce an earth image through joint inversion. In particular, consideration is given to surveys where only sparse MT soundings are needed to drastically reduce the non-uniqueness associated with background uncertainty while straining logistics minimally. Synthetic and field data are analysed using 2D and 3D finite element platforms developed for this purpose. Results to date suggest that dense ZTEM surveys can indeed provide detailed heterogeneous model images with large-scale averages constrained by a modest number of MT soundings.
Further research is needed in determining the allowable degree of MT sparseness and the relative weighting of the two data sets in joint inversion.
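The relative-weighting question in the last sentence can be made concrete as a combined error-normalized misfit; a minimal sketch, with synthetic data vectors and a hypothetical weight lam:

```python
import numpy as np

def joint_misfit(d_ztem, m_ztem, e_ztem, d_mt, m_mt, e_mt, lam=1.0):
    """Error-normalized chi-square of the dense ZTEM data plus lam
    times that of the sparse MT data; lam is the relative weighting
    of the two data sets discussed above (value here is arbitrary)."""
    chi2_ztem = np.sum(((d_ztem - m_ztem) / e_ztem) ** 2)
    chi2_mt = np.sum(((d_mt - m_mt) / e_mt) ** 2)
    return chi2_ztem + lam * chi2_mt
```

In an actual inversion, lam would be tuned so that neither data set dominates; the sketch fixes only the bookkeeping, not the regularization or model update.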
PGMS: A Case Study of Collecting PDA-Based Geo-Tagged Malaria-Related Survey Data
Zhou, Ying; Lobo, Neil F.; Wolkon, Adam; Gimnig, John E.; Malishee, Alpha; Stevenson, Jennifer; Sulistyawati; Collins, Frank H.; Madey, Greg
2014-01-01
Using mobile devices such as personal digital assistants (PDAs), smartphones, and tablet computers to electronically collect malaria-related field data is the future of field questionnaires. This case study seeks to design PGMS, a generic PDA-based framework for geo-tagged malaria-related data collection that can be used not only for large-scale community-level geo-tagged electronic malaria-related surveys, but also for a wide variety of electronic data collections for other infectious diseases. The framework includes two parts: the database designed for subsequent cross-sectional data analysis and the customized programs for the six study sites (two in Kenya, three in Indonesia, and one in Tanzania). In addition to the framework development, we also present the methods we used when configuring and deploying the PDAs to 1) reduce data entry errors, 2) conserve battery power, 3) field-install the programs onto dozens of handheld devices, 4) translate electronic questionnaires into local languages, 5) prevent data loss, and 6) transfer data from PDAs to computers for future analysis and storage. Since 2008, PGMS has successfully supported surveys that recorded 10,871 compounds and households, 52,126 persons, and 17,100 bed nets from the six sites. These numbers are still growing. PMID:25048377
Methods for collection and analysis of aquatic biological and microbiological samples
Britton, L.J.; Greeson, P.E.
1989-01-01
The series of chapters on techniques describes methods used by the U.S. Geological Survey for planning and conducting water-resources investigations. The material is arranged under major subject headings called books and is further subdivided into sections and chapters. Book 5 is on laboratory analysis. Section A is on water. The unit of publication, the chapter, is limited to a narrow field of subject matter. "Methods for Collection and Analysis of Aquatic Biological and Microbiological Samples" is the fourth chapter to be published under Section A of Book 5. The chapter number includes the letter of the section.This chapter was prepared by several aquatic biologists and microbiologists of the U.S. Geological Survey to provide accurate and precise methods for the collection and analysis of aquatic biological and microbiological samples.Use of brand, firm, and trade names in this chapter is for identification purposes only and does not constitute endorsement by the U.S. Geological Survey.This chapter supersedes "Methods for Collection and Analysis of Aquatic Biological and Microbiological Samples" edited by P.E. Greeson, T.A. Ehlke, G.A. Irwin, B.W. Lium, and K.V. Slack (U.S. Geological Survey Techniques of Water-Resources Investigations, Book 5, Chapter A4, 1977) and also supersedes "A Supplement to Methods for Collection and Analysis of Aquatic Biological and Microbiological Samples" by P.E. Greeson (U.S. Geological Survey Techniques of Water-Resources Investigations, Book 5, Chapter A4), Open-File Report 79-1279, 1979.
VizieR Online Data Catalog: Clusters of galaxies in SDSS-III (Wen+, 2012)
NASA Astrophysics Data System (ADS)
Wen, Z. L.; Han, J. L.; Liu, F. S.
2012-06-01
Wen et al. (2009, Cat. J/ApJS/183/197) identified 39668 galaxy clusters from the SDSS DR6 by the discrimination of member galaxies of clusters using photometric redshifts of galaxies. Wen & Han (2011ApJ...734...68W) improved the method and successfully identified the high-redshift clusters from the deep fields of the Canada-France-Hawaii Telescope (CFHT) Wide survey, the CHFT Deep survey, the Cosmic Evolution Survey, and the Spitzer Wide-area InfraRed Extragalactic survey. Here, we follow and improve the algorithm to identify clusters from SDSS-III (SDSS Data Release 8; Aihara et al. 2011ApJS..193...29A, see Cat. II/306). (1 data file).
A survey of medical image registration - under review.
Viergever, Max A; Maintz, J B Antoine; Klein, Stefan; Murphy, Keelin; Staring, Marius; Pluim, Josien P W
2016-10-01
A retrospective view on the past two decades of the field of medical image registration is presented, guided by the article "A survey of medical image registration" (Maintz and Viergever, 1998). It shows that the classification of the field introduced in that article is still usable, although some modifications would be needed to do justice to advances in the field. The main changes over the last twenty years are the shift from extrinsic to intrinsic registration, the primacy of intensity-based registration, the breakthrough of nonlinear registration, the progress of inter-subject registration, and the availability of generic image registration software packages. Two problems that were called urgent already 20 years ago are even more urgent nowadays: validation of registration methods, and translation of results of image registration research to clinical practice. It may be concluded that the field of medical image registration has evolved, but still is in need of further development in various aspects. Copyright © 2016 Elsevier B.V. All rights reserved.
Photometric redshifts for the CFHTLS T0004 deep and wide fields
NASA Astrophysics Data System (ADS)
Coupon, J.; Ilbert, O.; Kilbinger, M.; McCracken, H. J.; Mellier, Y.; Arnouts, S.; Bertin, E.; Hudelot, P.; Schultheis, M.; Le Fèvre, O.; Le Brun, V.; Guzzo, L.; Bardelli, S.; Zucca, E.; Bolzonella, M.; Garilli, B.; Zamorani, G.; Zanichelli, A.; Tresse, L.; Aussel, H.
2009-06-01
Aims: We compute photometric redshifts in the fourth public release of the Canada-France-Hawaii Telescope Legacy Survey. This unique multi-colour catalogue comprises u^*, g', r', i', z' photometry in four deep fields of 1 deg² each and 35 deg² distributed over three wide fields. Methods: We used a template-fitting method to compute photometric redshifts calibrated with a large catalogue of 16 983 high-quality spectroscopic redshifts from the VVDS-F02, VVDS-F22, DEEP2, and zCOSMOS surveys. The method includes correction of systematic offsets, template adaptation, and the use of priors. We also separated stars from galaxies using both size and colour information. Results: Comparing with galaxy spectroscopic redshifts, we find a photometric redshift dispersion, σ_{Δz/(1+z_s)}, of 0.028-0.030 and an outlier rate, |Δz| ≥ 0.15 × (1+z_s), of 3-4% in the deep field at i'_AB < 24. In the wide fields, we find a dispersion of 0.037-0.039 and an outlier rate of 3-4% at i'_AB < 22.5. Beyond i'_AB = 22.5 in the wide fields the outlier rate rises from 5% to 10% at i'_AB < 23 and i'_AB < 24, respectively. For the wide sample the systematic redshift bias stays below 1% to i'_AB < 22.5, whereas we find no significant bias in the deep fields. We investigated the effect of tile-to-tile photometric variations and demonstrated that the accuracy of our photometric redshifts is reduced by at most 21%. Application of our star-galaxy classifier reduced the contamination by stars in our catalogues from 60% to 8% at i'_AB < 22.5 in our field with the highest stellar density while keeping a complete galaxy sample. Our CFHTLS T0004 photometric redshifts are distributed to the community. Our release includes 592891 (i'_AB < 22.5) and 244701 (i'_AB < 24) reliable galaxy photometric redshifts in the wide and deep fields, respectively.
Based on observations obtained with MegaPrime/MegaCam, a joint project of CFHT and CEA/DAPNIA, at the Canada-France-Hawaii Telescope (CFHT) which is operated by the National Research Council (NRC) of Canada, the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique (CNRS) of France, and the University of Hawaii. This work is based in part on data products produced at Terapix and the Canadian Astronomy Data Centre as part of the Canada-France-Hawaii Telescope Legacy Survey, a collaborative project of NRC and CNRS.
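The two quoted statistics can be computed directly from matched photometric and spectroscopic redshifts; the sketch below uses the NMAD dispersion estimator, one common convention that may differ from the paper's exact definition, on synthetic data:

```python
import numpy as np

def photoz_metrics(z_phot, z_spec):
    """Dispersion of dz/(1+z_spec) via the NMAD estimator (a common
    convention; the paper's exact definition may differ) and the
    outlier rate |dz| >= 0.15 (1 + z_spec)."""
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    sigma = 1.4826 * np.median(np.abs(dz - np.median(dz)))
    eta = np.mean(np.abs(z_phot - z_spec) >= 0.15 * (1.0 + z_spec))
    return sigma, eta

# Synthetic catalogue: 3% scatter plus a 4% catastrophic-outlier tail.
rng = np.random.default_rng(7)
z_spec = rng.uniform(0.1, 1.2, 20_000)
z_phot = z_spec + 0.03 * (1 + z_spec) * rng.standard_normal(z_spec.size)
bad = rng.random(z_spec.size) < 0.04
z_phot[bad] += 0.8                     # catastrophic failures
sigma, eta = photoz_metrics(z_phot, z_spec)
```

The NMAD form is used because, unlike a plain standard deviation, it is insensitive to the catastrophic-outlier tail that the eta statistic counts separately.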
Identifying OH Imposters in the ALFALFA Neutral Hydrogen Survey
NASA Astrophysics Data System (ADS)
Suess, Katherine A.; Darling, Jeremy; Haynes, Martha P.; Giovanelli, Riccardo
2016-06-01
OH megamasers (OHMs) are rare, luminous molecular masers that are typically observed in (ultra)luminous infrared galaxies and serve as markers of major galaxy mergers. In blind emission line surveys such as the Arecibo Legacy Fast ALFA (ALFALFA) survey for neutral hydrogen (H I), OHMs at z ˜ 0.2 can mimic z ˜ 0.05 H I lines. We present the results of optical spectroscopy of ambiguous H I detections in the ALFALFA 40 per cent data release detected by the Wide-field Infrared Survey Explorer (WISE) but with uncertain optical counterparts. The optical redshifts, obtained from observations at the Apache Point Observatory, revealed five new OHMs and identified 129 H I optical counterparts; sixty candidates remain ambiguous. The new OHMs are the first detected in a blind spectral line survey. The number of OHMs in ALFALFA is consistent with predictions from the OH luminosity function. Additionally, the mid-infrared magnitudes and colours of the OHM host galaxies found in a blind survey do not seem to differ from those found in previous targeted surveys. This validates the methods used in previous IR-selected OHM surveys and indicates there is no previously unknown OHM-producing population at z ˜ 0.2. We also provide a method for future surveys to separate OH megamasers from 99 per cent of H I line emitters without optical spectroscopy by using WISE infrared colours and magnitudes. Since the fraction of OHMs found in flux-limited H I surveys is expected to increase with the survey's redshift, this selection method can be applied to future flux-limited high-redshift hydrogen surveys.
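The proposed WISE-based screen amounts to a cut in infrared color and magnitude; the function below illustrates the idea only, with hypothetical thresholds rather than the published criteria:

```python
def likely_ohm(w1, w2, w3):
    """Toy mid-infrared screen: OHM hosts are (U)LIRG-like, so flag
    sources that are both red in W1-W2 and bright in W3 (magnitudes).
    The cut values are hypothetical placeholders, not the published
    criteria."""
    red = (w1 - w2) > 0.8        # warm-dust-reddened WISE color
    bright = w3 < 9.0            # bright at 12 microns
    return red and bright
```

Applied to an H I candidate list, such a screen would keep OHM-like sources for optical follow-up while passing over the bulk of ordinary H I emitters.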
Buu, Anne; Johnson, Norman J.; Li, Runze; Tan, Xianming
2011-01-01
Zero-inflated count data are very common in health surveys. This study develops new variable selection methods for the zero-inflated Poisson regression model. Our simulations demonstrate the negative consequences which arise from the ignorance of zero-inflation. Among the competing methods, the one-step SCAD method is recommended because it has the highest specificity, sensitivity, exact fit, and lowest estimation error. The design of the simulations is based on the special features of two large national databases commonly used in the alcoholism and substance abuse field so that our findings can be easily generalized to the real settings. Applications of the methodology are demonstrated by empirical analyses on the data from a well-known alcohol study. PMID:21563207
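The zero-inflated Poisson model at issue mixes a point mass at zero with a Poisson count; a short numpy simulation (parameters are illustrative) shows the excess zeros that a plain Poisson fit would ignore:

```python
import numpy as np

def simulate_zip(n, pi_zero, lam, seed=0):
    """Zero-inflated Poisson draws: with probability pi_zero the count
    is a structural zero, otherwise Poisson(lam)."""
    rng = np.random.default_rng(seed)
    structural = rng.random(n) < pi_zero
    counts = rng.poisson(lam, n)
    counts[structural] = 0
    return counts

# Illustrative parameters: 40% structural zeros, Poisson mean 2.
y = simulate_zip(100_000, 0.4, 2.0)
zero_frac = np.mean(y == 0)
# ZIP predicts pi + (1 - pi) exp(-lam) of the counts to be zero,
# well above the exp(-lam) a plain Poisson with the same lam allows.
```

Fitting a plain Poisson to such data misattributes the structural zeros, which is the "ignorance of zero-inflation" whose consequences the simulations in the study quantify.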
Use of GPR Surveys in Historical Archaeology Studies at Gainesville, Mississippi (22HA600)
NASA Technical Reports Server (NTRS)
Goodwin, Ben; Giardino, Marco; Spruce, Joseph P.
2002-01-01
Ground Penetrating Radar (GPR) is used to study the underground remains of historic structures on the grounds of Stennis Space Center (SSC) in this viewgraph presentation. The main goal of the project described is to research, develop, and validate Remote Sensing (RS) and Geographic Information System (GIS) methods for aiding cultural resource assessments within SSC. The project georeferences historic imagery and maps to assist archaeological RS, field surveys, and excavations.
NASA Astrophysics Data System (ADS)
Butler, D. K.
1982-03-01
This report reviews the scope of a research effort initiated in 1974 at the U.S. Army Engineer Waterways Experiment Station with the objectives of (a) assessing the state of the art in geophysical cavity detection and delineation methodology and (b) developing new methods and improving or adapting old methods for application to cavity detection and delineation. Two field test sites were selected: (a) the Medford Cave site with a relatively shallow (10- to 50-ft-deep) air-filled cavity system and (b) the Manatee Springs site with a deeper (approximately 100-ft-deep) water-filled cavity system. Results of field studies at the Medford Cave site are presented in this report: (a) the site geology, (b) the site topographic survey, (c) the site drilling program (boreholes for geophysical tests, for determination of a detailed geological cross section, and for verification of geophysical anomalies), (d) details of magnetic and microgravimetric surveys, and (e) correlation of geophysical results with known site geology. Qualitative interpretation guidelines using complementary geophysical techniques for site investigations in karst regions are presented. Including the results of electrical resistivity surveys conducted at the Medford Cave site, the qualitative guidelines are applied to four profile lines, and drilling locations are indicated on the profile plots of gravity, magnetic, and electrical resistivity data. Borehole logs are then presented for comparison with the predictions of the qualitative interpretation guidelines.
Xu, Y.; Xia, J.; Miller, R.D.
2006-01-01
Multichannel analysis of surface waves is a developing method widely used in shallow subsurface investigations. The field procedures and related parameters are very important for successful applications. Among these parameters, the source-receiver offset range is seldom discussed in theory and is normally determined by empirical or semi-quantitative methods in current practice. This paper discusses the problem from a theoretical perspective. A formula for quantitatively evaluating a layered homogeneous elastic model was developed. The analytical results based on simple models and experimental data demonstrate that the formula is correct for surface wave surveys for near-surface applications. © 2005 Elsevier B.V. All rights reserved.
Optimum structural design with plate bending elements - A survey
NASA Technical Reports Server (NTRS)
Haftka, R. T.; Prasad, B.
1981-01-01
A survey is presented of recently published papers in the field of optimum structural design of plates, largely with respect to the minimum-weight design of plates subject to such constraints as fundamental frequency maximization. It is shown that, due to the availability of powerful computers, the trend in optimum plate design is away from methods tailored to specific geometry and loads and toward methods that can be easily programmed for any kind of plate, such as finite element methods. A corresponding shift is seen in optimization from variational techniques to numerical optimization algorithms. Among the topics covered are fully stressed design and optimality criteria, mathematical programming, smooth and ribbed designs, design against plastic collapse, buckling constraints, and vibration constraints.
Methods for solving reasoning problems in abstract argumentation – A survey
Charwat, Günther; Dvořák, Wolfgang; Gaggl, Sarah A.; Wallner, Johannes P.; Woltran, Stefan
2015-01-01
Within the last decade, abstract argumentation has emerged as a central field in Artificial Intelligence. Besides providing a core formalism for many advanced argumentation systems, abstract argumentation has also served to capture several non-monotonic logics and other AI related principles. Although the idea of abstract argumentation is appealingly simple, several reasoning problems in this formalism exhibit high computational complexity. This calls for advanced techniques when it comes to implementation issues, a challenge which has been recently faced from different angles. In this survey, we give an overview on different methods for solving reasoning problems in abstract argumentation and compare their particular features. Moreover, we highlight available state-of-the-art systems for abstract argumentation, which put these methods to practice. PMID:25737590
NASA Astrophysics Data System (ADS)
Wang, Huiyuan; Mo, H. J.; Yang, Xiaohu; Zhang, Youcai; Shi, JingJing; Jing, Y. P.; Liu, Chengze; Li, Shijie; Kang, Xi; Gao, Yang
2016-11-01
A method we developed recently for the reconstruction of the initial density field in the nearby universe is applied to the Sloan Digital Sky Survey Data Release 7. A high-resolution N-body constrained simulation (CS) of the reconstructed initial conditions, with 3072³ particles evolved in a 500 h⁻¹ Mpc box, is carried out and analyzed in terms of the statistical properties of the final density field and its relation with the distribution of Sloan Digital Sky Survey galaxies. We find that the statistical properties of the cosmic web and the halo populations are accurately reproduced in the CS. The galaxy density field is strongly correlated with the CS density field, with a bias that depends on both galaxy luminosity and color. Our further investigations show that the CS provides robust quantities describing the environments within which the observed galaxies and galaxy systems reside. Cosmic variance is greatly reduced in the CS so that the statistical uncertainties can be controlled effectively, even for samples of small volumes.
US army land condition-trend analysis (LCTA) program
NASA Astrophysics Data System (ADS)
Diersing, Victor E.; Shaw, Robert B.; Tazik, David J.
1992-05-01
The US Army Land Condition-Trend Analysis (LCTA) program is a standardized method of data collection, analysis, and reporting designed to meet multiple goals and objectives. The method utilizes vascular plant inventories, permanent field plot data, and wildlife inventories. Vascular plant inventories are used for environmental documentation, training of personnel, species identification during LCTA implementation, and as a survey for state and federal endangered or threatened species. The permanent field plot data documents the vegetational, edaphic, topographic, and disturbance characteristics of the installation. Inventory plots are allocated in a stratified random fashion across the installation utilizing a geographic information system that integrates satellite imagery and soil survey information. Ground cover, canopy cover, woody plant density, slope length, slope gradient, soil information, and disturbance data are collected at each plot. Plot data are used to: (1) describe plant communities, (2) characterize wildlife and threatened and endangered species habitat, (3) document amount and kind of military and nonmilitary disturbance, (4) determine the impact of military training on vegetation and soil resources, (5) estimate soil erosion potential, (6) classify land as to the kind and amount of use it can support, (7) determine allowable use estimates for tracked vehicle training, (8) document concealment resources, (9) identify lands that require restoration and evaluate the effectiveness of restorative techniques, and (10) evaluate potential acquisition property. Wildlife inventories survey small and midsize mammals, birds, bats, amphibians, and reptiles. Data from these surveys can be used for environmental documentation, to identify state and federal endangered and threatened species, and to evaluate the impact of military activities on wildlife populations. 
Short- and long-term monitoring of permanent field plots is used to evaluate and adjust land management decisions.
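The stratified random plot allocation described above can be sketched as proportional allocation across strata. The strata names, areas, and plot counts below are hypothetical illustrations, not LCTA's actual parameters.

```python
def allocate_plots(strata_areas, total_plots):
    """Proportional allocation of inventory plots across strata (e.g.
    land-cover classes from satellite imagery intersected with soil map
    units), using the largest-remainder rule so the integer counts sum
    exactly to total_plots."""
    total_area = sum(strata_areas.values())
    quotas = {s: total_plots * a / total_area for s, a in strata_areas.items()}
    counts = {s: int(q) for s, q in quotas.items()}
    shortfall = total_plots - sum(counts.values())
    # Hand any leftover plots to the strata with the largest fractional remainders.
    for s in sorted(quotas, key=lambda k: quotas[k] - counts[k], reverse=True)[:shortfall]:
        counts[s] += 1
    return counts

# Hypothetical installation: stratum areas in hectares.
plots = allocate_plots({"grassland": 5200, "shrubland": 2600,
                        "woodland": 1200, "bare": 1000}, 200)
```

Within each stratum, individual plot locations would then be drawn at random, which is where the GIS overlay of imagery and soil survey data comes in.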
A faint field-galaxy redshift survey in quasar fields
NASA Technical Reports Server (NTRS)
Yee, Howard K. C.; Ellingson, Erica
1993-01-01
Quasars serve as excellent markers for the identification of high-redshift galaxies and galaxy clusters. In past surveys, nearly 20 clusters of Abell richness class 1 or richer associated with quasars in the redshift range 0.2 < z < 0.8 were identified. In order to study these galaxy clusters in detail, a major redshift survey of faint galaxies in these fields using the CFHT LAMA/MARLIN multi-object spectroscopy system was carried out. An equally important product in such a survey is the redshifts of the field galaxies not associated with the quasars. Some preliminary results on field galaxies from an interim set of data from our redshift survey in quasar fields are presented.
Exploiting Electric and Magnetic Fields for Underwater Characterization
2011-03-01
geophysical surveys are primarily limited to passive magnetic systems towed from a surface vessel. These systems utilize fluxgate, Overhauser, or atomic... magnetometer sensors, often deployed in arrays towed from the stern of small to moderate-size vessels. Active source electromagnetic methods have been
KSWAGS: A Swift X-Ray and UV Survey of the Kepler Field 1
NASA Technical Reports Server (NTRS)
Smith, Krista Lynne; Boyd, Patricia T.; Mushotzky, Richard F.; Gehrels, Neil; Edelson, Rick; Howell, Steve B.; Gelino, Dawn M.; Brown, Alexander; Young, Steve
2015-01-01
We introduce the first phase of the Kepler-Swift Active Galaxies and Stars survey (KSwAGS), a simultaneous X-ray and UV survey of approximately 6 square degrees of the Kepler field using the Swift XRT and UVOT. We detect 93 unique X-ray sources with S/N ≥ 3 with the XRT, of which 60 have UV counterparts. We use the Kepler Input Catalog (KIC) to obtain the optical counterparts of these sources, and construct the fX/fV ratio as a first approximation of the classification of the source. The survey produces a mixture of stellar sources, extragalactic sources, and sources which we are not able to classify with certainty. We have obtained optical spectra for thirty of these targets, and are conducting an ongoing observing campaign to fully identify the sample. For sources classified as stellar or AGN with certainty, we construct SEDs using the 2MASS, UBV, and GALEX data supplied for their optical counterparts by the KIC, and show that the SEDs differ qualitatively between the source types, and so can offer a method of classification in the absence of a spectrum. Future papers in this series will analyze the timing properties of the stars and AGN in our sample separately. Our survey provides the first X-ray and UV data for a number of known variable stellar sources, as well as a large number of new X-ray detections in this well-studied portion of the sky. The KSwAGS survey is currently ongoing in the K2 ecliptic plane fields.
Are wildlife detector dogs or people better at finding Desert Tortoises (Gopherus agassizii)?
Nussear, K.E.; Esque, T.C.; Heaton, J.S.; Cablk, Mary E.; Drake, K.K.; Valentin, C.; Yee, J.L.; Medica, P.A.
2008-01-01
Our ability to study threatened and endangered species depends on locating them readily in the field. Recent studies highlight the effectiveness of trained detector dogs to locate wildlife during field surveys, including Desert Tortoises in a semi-natural setting. Desert Tortoises (Gopherus agassizii) are cryptic and difficult to detect during surveys, especially the smaller size classes. We conducted comparative surveys to determine whether human or detector dog teams were more effective at locating Desert Tortoises in the wild. We compared detectability of Desert Tortoises and the costs to deploy human and dog search teams. Detectability of tortoises was not statistically different for either team, and was estimated to be approximately 70% (SE = 5%). Dogs found a greater proportion of tortoises located in vegetation than did humans. The dog teams finished surveys 2.5 hours faster than the humans on average each day. The human team cost was approximately $3,000 less per square kilometer sampled. Dog teams provided a quick and effective method for surveying for adult Desert Tortoises; however, we were unable to determine their effectiveness at locating smaller size classes. Detection of smaller size classes during surveys would improve management of the species and should be addressed by future research using Desert Tortoise detector dogs.
NASA Astrophysics Data System (ADS)
Piro, Salvatore; Ceraudo, Giuseppe; Zamuner, Daniela
2010-05-01
To improve knowledge of the unknown buried structures beneath the currently studied levels in the territory of ancient Aquinum (Frosinone, Italy), a scientific collaboration within the "Ager Aquinas Project" was developed during 2008-2009 between the University of Salento (Department of Cultural Heritage - Laboratory of Ancient Topography and Photogrammetry) and the Institute of Technologies Applied to Cultural Heritage (ITABC-C.N.R.). The site discussed in this paper had been identified in the past through interpretation of historical vertical aerial photographs and field-walking surveys. Ancient Aquinum is characterised by two main aspects: the first is the presence of a very large defence system with mighty walls and a large ditch; the second is a regular but not orthogonal road system, bordering town blocks of an unusual parallelogram shape. With the results obtained from the first aerial data sets and field surveys, it was possible to map the main town plan, drawing the principal road system inside and outside the town. Although the analysis of the aerial photographic evidence allowed a global interpretation of the site, it was not possible to reconstruct the archaeological evidence in the central portion of the town. Therefore, during 2008 the project continued with new acquisition and elaboration of aerial photographs, field-walking surveys, and GPR surveys, with the aim of better defining the urban plan of the central portion of the ancient town. The location, depth, and size of the buried buildings were effectively estimated from non-destructive remote sensing with gradiometric and ground-penetrating radar systems. Archaeological excavations carried out by Prof. Giuseppe Ceraudo (University of Salento, Lecce) during the summer of 2009 confirmed the structures identified by the geophysical methods.
This project is still in progress and new surveys, employing integrated geophysical methods, are planned for the next year.
NASA Astrophysics Data System (ADS)
Yano, S.; Kondo, H.; Tawara, Y.; Yamada, T.; Mori, K.; Yoshida, A.; Tada, K.; Tsujimura, M.; Tokunaga, T.
2017-12-01
It is important to understand groundwater systems, including their recharge, flow, storage, discharge, and withdrawal, so that we can use groundwater resources efficiently and sustainably. To examine groundwater recharge, several methods have been discussed based on water balance estimation, in situ experiments, and hydrological tracers. However, few studies have developed a concrete framework for quantifying groundwater recharge rates in an undefined area. In this study, we established a robust method to quantitatively determine water cycles and estimate the groundwater recharge rate by combining the advantages of field surveys and model simulations. We combined in situ hydrogeological observations with three-dimensional modeling in a mountainous basin area in Japan. We adopted a general-purpose terrestrial fluid-flow simulator (GETFLOWS) to develop a geological model and simulate the local water cycle. Local data relating to topology, geology, vegetation, land use, climate, and water use were collected from the existing literature and observations to assess the spatiotemporal variations of the water balance from 2011 to 2013. The characteristic structures of geology and soils, as found through field surveys, were parameterized for incorporation into the model. The simulated results were validated using observed groundwater levels and resulted in a Nash-Sutcliffe Model Efficiency Coefficient of 0.92. The results suggested that local groundwater flows across the watershed boundary and that the groundwater recharge rate, defined as the flux of water reaching the local unconfined groundwater table, has values similar to the long-term level estimated in the lower soil layers. This innovative method enables us to quantify the groundwater recharge rate and its spatiotemporal variability with high accuracy, which contributes to establishing a foundation for sustainable groundwater management.
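The Nash-Sutcliffe coefficient used here to validate the simulation against observed groundwater levels is straightforward to compute. The series below is a hypothetical illustration, not the study's data.

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe model efficiency: 1 - SSE / variance of the
    observations about their mean. 1.0 is a perfect match; values near 1
    (such as the 0.92 reported here) indicate close agreement."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_obs = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_obs

# Hypothetical groundwater levels (m) at one observation well.
obs = [10.2, 10.4, 10.1, 9.8, 9.9, 10.3]
sim = [10.1, 10.5, 10.0, 9.9, 9.8, 10.2]
nse = nash_sutcliffe(obs, sim)
```

Note that the coefficient can be negative: a model scoring below zero explains less variance than simply using the mean of the observations.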
Spatial variations in mortality in pelagic early life stages of a marine fish (Gadus morhua)
NASA Astrophysics Data System (ADS)
Langangen, Øystein; Stige, Leif C.; Yaragina, Natalia A.; Ottersen, Geir; Vikebø, Frode B.; Stenseth, Nils Chr.
2014-09-01
Mortality of pelagic eggs and larvae of marine fish is often assumed to be constant both in space and time due to a lack of information. This may, however, be a gross oversimplification, as early life stages are likely to experience large variations in mortality both in time and space. In this paper we develop a method for estimating the spatial variability in mortality of eggs and larvae. The method relies on survey data and physical-biological particle-drift models to predict the drift of ichthyoplankton. Furthermore, the method was used to estimate the spatially resolved mortality field in the egg and larval stages of Barents Sea cod (Gadus morhua). We analyzed data from the Barents Sea for the period between 1959 and 1993, when two surveys are available: a spring and a summer survey. An individual-based physical-biological particle-drift model, tailored to the egg and larval stages of Barents Sea cod, was used to predict the drift trajectories from the observed stage-specific distributions in spring to the time of observation in the summer, a drift time of approximately 45 days. We interpreted the spatial patterns in the differences between the predicted and observed abundance distributions in summer as reflecting the spatial patterns in mortality over the drift period. Using the estimated mortality fields, we show that the spatial variations in mortality might have a significant impact on survival to later life stages, and we suggest that there may be trade-offs between increased early survival in offshore regions and reduced probability of ending up in the favorable nursery grounds in the Barents Sea. In addition, we show that accounting for the estimated mortality field improves the correlation between a simulated recruitment index and observation-based indices of juvenile abundance.
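In the simplest exponential-decline reading, interpreting the predicted-versus-observed abundance difference as mortality amounts to the following. This is a sketch only; the paper's statistical treatment is more elaborate, and the abundances below are hypothetical.

```python
import math

def mortality_rate(predicted, observed, drift_days):
    """Instantaneous mortality rate Z (per day) implied by the decline from
    the drift-model-predicted abundance (no mortality applied) to the
    observed abundance, assuming observed = predicted * exp(-Z * t)."""
    return math.log(predicted / observed) / drift_days

# Hypothetical grid cell: 1e6 larvae predicted by the drift model,
# 2.5e5 observed in the summer survey, ~45-day drift period.
z = mortality_rate(1.0e6, 2.5e5, 45.0)
```

Repeating this cell by cell over the predicted and observed summer distributions yields a spatially resolved mortality field of the kind estimated in the paper.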
Monitoring tigers with confidence.
Linkie, Matthew; Guillera-Arroita, Gurutzeta; Smith, Joseph; Rayan, D Mark
2010-12-01
With only 5% of the world's wild tigers (Panthera tigris Linnaeus, 1758) remaining since the last century, conservationists urgently need to know whether or not the management strategies currently being employed are effectively protecting these tigers. This knowledge is contingent on the ability to reliably monitor tiger populations, or subsets thereof, over space and time. In this paper, we focus on the 2 seminal methodologies (camera trap and occupancy surveys) that have enabled the monitoring of tiger populations with greater confidence. Specifically, we: (i) describe their statistical theory and application in the field; (ii) discuss issues associated with their survey designs and state variable modeling; and, (iii) discuss their future directions. These methods have had an unprecedented influence on increasing statistical rigor within tiger surveys and, also, surveys of other carnivore species. Nevertheless, only 2 published camera trap studies have gone beyond single baseline assessments and actually monitored population trends. For low density tiger populations (e.g. <1 adult tiger/100 km²) obtaining sufficient precision for state variable estimates from camera trapping remains a challenge because of insufficient detection probabilities and/or sample sizes. Occupancy surveys have overcome this problem by redefining the sampling unit (e.g. grid cells and not individual tigers). Current research is focusing on developing spatially explicit capture-mark-recapture models and estimating abundance indices from landscape-scale occupancy surveys, as well as the use of genetic information for identifying and monitoring tigers. The widespread application of these monitoring methods in the field now enables complementary studies on the impact of the different threats to tiger populations and their response to varying management intervention. © 2010 ISZS, Blackwell Publishing and IOZ/CAS.
Kimiafar, Khalil; Sarbaz, Masoumeh; Sheikhtaheri, Abbas
2016-01-01
Background: There are no general strategies or tools to evaluate daily lesson plans; however, assessments conducted using traditional methods usually include course plans. This study aimed to evaluate the strengths and weaknesses of online survey software in collecting data on education in medical fields and the application of such software to evaluate students' views and modification of lesson plans. Methods: After investigating the available online survey software, esurveypro was selected for assessing daily lesson plans. After using the software for one semester, a questionnaire was prepared to assess the advantages and disadvantages of this method and students' views in a cross-sectional study. Results: The majority of the students (51.7%) rated the evaluation of classes per session (lesson plans) using the online survey as useful or very useful. About 51% (n=36) of the students considered this method effective in improving the management of each session, 67.1% (n=47) considered it effective in improving the management of sessions for the next semester, and 51.4% (n=36) said it had a high impact on improving the educational content of subsequent sessions. Finally, 61.4% (n=43) of the students expressed high and very high levels of satisfaction with using an online survey at each session. Conclusion: The use of online surveys may be appropriate to improve lesson plans and educational planning at different levels. This method can be used for other evaluations and for assessing people's opinions at different levels of an educational system. PMID:28491839
Variability-selected active galactic nuclei from supernova search in the Chandra deep field south
NASA Astrophysics Data System (ADS)
Trevese, D.; Boutsia, K.; Vagnetti, F.; Cappellaro, E.; Puccetti, S.
2008-09-01
Context: Variability is a property shared by virtually all active galactic nuclei (AGNs), and was adopted as a criterion for their selection using data from multi-epoch surveys. Low Luminosity AGNs (LLAGNs) are contaminated by the light of their host galaxies, and cannot therefore be detected by the usual colour techniques. For this reason, their evolution in cosmic time is poorly known. Consistency with the evolution derived from X-ray detected samples has not been clearly established so far, also because the low luminosity population consists of a mixture of different object types. LLAGNs can be detected by the nuclear optical variability of extended objects. Aims: Several variability surveys have been, or are being, conducted for the detection of supernovae (SNe). We propose to re-analyse these SNe data using a variability criterion optimised for AGN detection, to select a new AGN sample and study its properties. Methods: We analysed images acquired with the wide field imager at the 2.2 m ESO/MPI telescope, in the framework of the STRESS supernova survey. We selected the AXAF field centred on the Chandra Deep Field South where, besides the deep X-ray survey, various optical data exist, originating in the EIS and COMBO-17 photometric surveys and the spectroscopic database of GOODS. Results: We obtained a catalogue of 132 variable AGN candidates. Several of the candidates are X-ray sources. We compare our results with an HST variability study of X-ray and IR detected AGNs, finding consistent results. The relatively high fraction of confirmed AGNs in our sample (60%) allowed us to extract a list of reliable AGN candidates for spectroscopic follow-up observations. Table [see full text] is only available in electronic form at http://www.aanda.org
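A generic form of variability selection — not necessarily the exact criterion optimised for AGN detection in this survey — flags objects whose magnitude scatter exceeds what their photometric errors allow. The epochs, magnitudes, and errors below are hypothetical.

```python
def reduced_chi2(mags, sigmas):
    """Reduced chi-square of a magnitude time series against the
    constant-source hypothesis, with inverse-variance weighting; objects
    well above ~1 are variability (e.g. AGN) candidates."""
    weights = [1.0 / s ** 2 for s in sigmas]
    mean = sum(w * m for w, m in zip(weights, mags)) / sum(weights)
    chi2 = sum(((m - mean) / s) ** 2 for m, s in zip(mags, sigmas))
    return chi2 / (len(mags) - 1)

# Hypothetical 5-epoch series with 0.02 mag errors: a 0.2 mag nuclear
# brightening in the third epoch stands out clearly.
quiet = reduced_chi2([21.50, 21.52, 21.49, 21.51, 21.50], [0.02] * 5)
flaring = reduced_chi2([21.50, 21.52, 21.30, 21.51, 21.50], [0.02] * 5)
```

For LLAGN work the statistic would be computed on the nuclear aperture photometry of extended objects, which is what lets variability recover sources that colour selection misses.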
Development of the Stress of Immigration Survey: A Field Test Among Mexican Immigrant Women.
Sternberg, Rosa Maria; Nápoles, Anna Maria; Gregorich, Steven; Paul, Steven; Lee, Kathryn A; Stewart, Anita L
2016-01-01
The Stress of Immigration Survey (SOIS) is a screening tool used to assess immigration-related stress. The mixed methods approach included concept development, pretesting, field testing, and psychometric evaluation in a sample of 131 low-income women of Mexican descent. The 21-item SOIS screens for stress related to language, immigrant status, work issues, yearning for family and home country, and cultural dissonance. Mean scores ranged from 3.6 to 4.4 (a scale of 1-5, higher is more stress). Cronbach α values were more than 0.80 for all subscales. The SOIS may be a useful screening tool for detecting high levels of immigration-related stress in low-income Mexican immigrant women.
Buffington, Kevin J.; Dugger, Bruce D.; Thorne, Karen M.; Takekawa, John Y.
2016-01-01
Airborne light detection and ranging (lidar) is a valuable tool for collecting large amounts of elevation data across large areas; however, the limited ability to penetrate dense vegetation with lidar hinders its usefulness for measuring tidal marsh platforms. Methods to correct lidar elevation data are available, but a reliable method that requires limited field work and maintains spatial resolution is lacking. We present a novel method, the Lidar Elevation Adjustment with NDVI (LEAN), to correct lidar digital elevation models (DEMs) with vegetation indices from readily available multispectral airborne imagery (NAIP) and RTK-GPS surveys. Using 17 study sites along the Pacific coast of the U.S., we achieved an average root mean squared error (RMSE) of 0.072 m, with a 40–75% improvement in accuracy from the lidar bare earth DEM. Results from our method compared favorably with results from three other methods (minimum-bin gridding, mean error correction, and vegetation correction factors), and a power analysis applying our extensive RTK-GPS dataset showed that on average 118 points were necessary to calibrate a site-specific correction model for tidal marshes along the Pacific coast. By using available imagery and with minimal field surveys, we showed that lidar-derived DEMs can be adjusted for greater accuracy while maintaining high (1 m) resolution.
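The core of a LEAN-style correction can be sketched as a regression of lidar elevation error on NDVI at the RTK-GPS calibration points, then subtracting the modeled bias from each DEM cell. This is a simplified illustration with synthetic values; the published method's calibration details may differ.

```python
def fit_error_model(ndvi, lidar_minus_rtk):
    """Least-squares fit of lidar elevation error (lidar DEM minus RTK-GPS
    ground elevation) against NDVI: denser marsh vegetation (higher NDVI)
    tends to bias lidar returns upward. Returns (slope, intercept)."""
    n = len(ndvi)
    mx, my = sum(ndvi) / n, sum(lidar_minus_rtk) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(ndvi, lidar_minus_rtk))
    sxx = sum((x - mx) ** 2 for x in ndvi)
    slope = sxy / sxx
    return slope, my - slope * mx

def correct_dem(dem_value, ndvi_value, slope, intercept):
    """Subtract the modeled vegetation bias from one DEM cell."""
    return dem_value - (intercept + slope * ndvi_value)

def rmse(errors):
    """Root mean squared error of residual elevation errors (m)."""
    return (sum(e * e for e in errors) / len(errors)) ** 0.5

# Synthetic calibration points where error = 0.05 + 0.4 * NDVI (metres).
ndvi = [0.2, 0.4, 0.6, 0.8]
err = [0.05 + 0.4 * x for x in ndvi]
slope, intercept = fit_error_model(ndvi, err)
```

Because the correction is applied per cell using the imagery's NDVI, the DEM keeps its original 1 m resolution, which is the advantage over coarse bin-minimum or uniform mean-error corrections.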
Guan, Shane; Vignola, Joseph; Judge, John; Turo, Diego
2015-12-01
Offshore oil and gas exploration using seismic airguns generates intense underwater pulses that could cause marine mammal hearing impairment and/or behavioral disturbances. However, few studies have investigated the resulting multipath propagation and reverberation from airgun pulses. This research uses continuous acoustic recordings collected in the Arctic during a low-level open-water shallow marine seismic survey, to measure noise levels between airgun pulses. Two methods were used to quantify noise levels during these inter-pulse intervals. The first, based on calculating the root-mean-square sound pressure level in various sub-intervals, is referred to as the increment computation method, and the second, which employs the Hilbert transform to calculate instantaneous acoustic amplitudes, is referred to as the Hilbert transform method. Analyses using both methods yield similar results, showing that the inter-pulse sound field exceeds ambient noise levels by as much as 9 dB during relatively quiet conditions. Inter-pulse noise levels are also related to the source distance, probably due to the higher reverberant conditions of the very shallow water environment. These methods can be used to quantify acoustic environment impacts from anthropogenic transient noises (e.g., seismic pulses, impact pile driving, and sonar pings) and to address potential acoustic masking affecting marine mammals.
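The increment computation method described above reduces, in essence, to RMS sound pressure levels over fixed sub-intervals of the inter-pulse record. This sketch assumes calibrated pressure samples in pascals; window lengths and values are illustrative.

```python
import math

def rms_spl(pressure_pa, p_ref=1e-6):
    """Root-mean-square sound pressure level in dB re 1 uPa (the standard
    underwater reference) for a window of pressure samples in pascals."""
    rms = math.sqrt(sum(p * p for p in pressure_pa) / len(pressure_pa))
    return 20.0 * math.log10(rms / p_ref)

def increment_levels(pressure_pa, window):
    """Increment computation method, in essence: split the inter-pulse
    record into fixed sub-intervals and report one RMS level per interval,
    so quiet stretches between airgun pulses can be compared with ambient
    conditions."""
    return [rms_spl(pressure_pa[i:i + window])
            for i in range(0, len(pressure_pa) - window + 1, window)]
```

The Hilbert-transform alternative instead computes an instantaneous amplitude envelope of the analytic signal; both approaches yielded similar inter-pulse levels in the study.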
Collins, Brian D.; Brown, Kristin M.; Fairley, Helen C.
2008-01-01
This report presents the results of an evaluation of terrestrial light detection and ranging (LIDAR) for monitoring geomorphic change at archeological sites located within Grand Canyon National Park, Ariz. Traditionally, topographic change-detection studies have used total station methods for the collection of data related to key measurable features of site erosion such as the location of thalwegs and knickpoints of gullies that traverse archeological sites (for example, Pederson and others, 2003). Total station methods require survey teams to walk within and on the features of interest within the archeological sites to take accurate measurements. As a result, site impacts may develop such as trailing, damage to cryptogamic crusts, and surface compaction that can exacerbate future erosion of the sites. National Park Service (NPS) resource managers have become increasingly concerned that repeated surveys for research and monitoring purposes may have a detrimental impact on the resources that researchers are trying to study and protect. Beginning in 2006, the Sociocultural Program of the U.S. Geological Survey's (USGS) Grand Canyon Monitoring and Research Center (GCMRC) initiated an evaluation of terrestrial LIDAR as a new monitoring tool that might enhance data quality and reduce site impacts. This evaluation was conducted as one part of an ongoing study to develop objective, replicable, quantifiable monitoring protocols for tracking the status and trend of variables affecting archeological site condition along the Colorado River corridor. The overall study consists of two elements: (1) an evaluation of the methodology through direct comparison to geomorphologic metrics already being collected by total station methods (this report) and (2) an evaluation of terrestrial LIDAR's ability to detect topographic change through the collection of temporally different datasets (a report on this portion of the study is anticipated early in 2009). 
The main goals of the first element of the study were to: (1) test the methodology and survey protocols of terrestrial LIDAR surveying under actual archeological site field conditions; (2) examine the ability to collect topographic data of entire archeological sites given such constraints as vegetation and rough topography; and (3) evaluate the ability of terrestrial LIDAR to accurately map the locations of key geomorphic features already being collected by total station methods, such as gully thalweg and knickpoint locations. This report focuses on the ability of terrestrial LIDAR to duplicate total station methods, including typical erosion-related change features such as the plan view gully thalweg location and the gully thalweg long profile. The report also presents information concerning the use of terrestrial LIDAR for archeological site monitoring in a general sense. In addition, a detailed comparison of the site impacts caused by both total station and terrestrial LIDAR survey methods is presented using a suite of indicators, including total field survey time, field footstep count, and data-processing time. A thorough discussion of the relative benefits and limitations of using terrestrial LIDAR for monitoring erosion-induced changes at archeological sites in Grand Canyon National Park concludes this report.
The JCMT BISTRO Survey: The Magnetic Field Strength in the Orion A Filament
NASA Astrophysics Data System (ADS)
Pattle, Kate; Ward-Thompson, Derek; Berry, David; Hatchell, Jennifer; Chen, Huei-Ru; Pon, Andy; Koch, Patrick M.; Kwon, Woojin; Kim, Jongsoo; Bastien, Pierre; Cho, Jungyeon; Coudé, Simon; Di Francesco, James; Fuller, Gary; Furuya, Ray S.; Graves, Sarah F.; Johnstone, Doug; Kirk, Jason; Kwon, Jungmi; Lee, Chang Won; Matthews, Brenda C.; Mottram, Joseph C.; Parsons, Harriet; Sadavoy, Sarah; Shinnaga, Hiroko; Soam, Archana; Hasegawa, Tetsuo; Lai, Shih-Ping; Qiu, Keping; Friberg, Per
2017-09-01
We determine the magnetic field strength in the OMC 1 region of the Orion A filament via a new implementation of the Chandrasekhar-Fermi method using observations performed as part of the James Clerk Maxwell Telescope (JCMT) B-Fields In Star-forming Region Observations (BISTRO) survey with the POL-2 instrument. We combine BISTRO data with archival SCUBA-2 and HARP observations to find a plane-of-sky magnetic field strength in OMC 1 of B_pos = 6.6 ± 4.7 mG, where δB_pos = 4.7 mG represents a predominantly systematic uncertainty. We develop a new method for measuring angular dispersion, analogous to unsharp masking. We find a magnetic energy density of ~1.7 × 10^-7 J m^-3 in OMC 1, comparable both to the gravitational potential energy density of OMC 1 (~10^-7 J m^-3) and to the energy density in the Orion BN/KL outflow (~10^-7 J m^-3). We find that neither the Alfvén velocity in OMC 1 nor the velocity of the super-Alfvénic outflow ejecta is sufficiently large for the BN/KL outflow to have caused large-scale distortion of the local magnetic field in the ~500 yr lifetime of the outflow. Hence, we propose that the hourglass field morphology in OMC 1 is caused by the distortion of a primordial cylindrically symmetric magnetic field by the gravitational fragmentation of the filament and/or the gravitational interaction of the BN/KL and S clumps. We find that OMC 1 is currently in or near magnetically supported equilibrium, and that the current large-scale morphology of the BN/KL outflow is regulated by the geometry of the magnetic field in OMC 1, and not vice versa.
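One common form of the Chandrasekhar-Fermi relation (the survey's new implementation refines how the angular dispersion is measured, but the underlying relation is this one) can be evaluated directly. The density and velocity values below are illustrative dense-gas numbers, not the OMC 1 measurement itself.

```python
import math

def cf_field_strength(n_h2_cm3, sigma_v_cm_s, sigma_theta_rad, q=0.5):
    """Plane-of-sky field strength from the Chandrasekhar-Fermi relation in
    cgs units: B_pos = Q * sqrt(4*pi*rho) * sigma_v / sigma_theta, with
    rho = mu * m_H * n(H2) and mu = 2.8 per H2 (including helium).
    Returns gauss; Q ~ 0.5 is the usual correction factor."""
    m_h = 1.6726e-24  # proton mass, g
    rho = 2.8 * m_h * n_h2_cm3
    return q * math.sqrt(4.0 * math.pi * rho) * sigma_v_cm_s / sigma_theta_rad

# Illustrative values: n(H2) = 1e7 cm^-3, sigma_v = 1 km/s,
# polarization-angle dispersion sigma_theta ~ 0.1 rad (~6 deg).
b_gauss = cf_field_strength(1.0e7, 1.0e5, 0.1)
```

With numbers of this order the relation returns fields of a few to tens of mG, the same regime as the B_pos = 6.6 ± 4.7 mG reported for OMC 1.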
Petitot, Maud; Manceau, Nicolas; Geniez, Philippe; Besnard, Aurélien
2014-09-01
Setting up effective conservation strategies requires precise determination of the targeted species' distribution area and, if possible, its local abundance. However, detection issues make these objectives complex for most vertebrates. The detection probability is usually <1 and is highly dependent on species phenology and other environmental variables. The aim of this study was to define an optimized survey protocol for the Mediterranean amphibian community, that is, to determine the most favorable periods and the most effective sampling techniques for detecting all species present on a site in a minimum number of field sessions and a minimum amount of prospecting effort. We visited 49 ponds located in the Languedoc region of southern France on four occasions between February and June 2011. Amphibians were detected using three methods: nighttime call count, nighttime visual encounter, and daytime netting. The detection/nondetection data obtained were then modeled using site-occupancy models. The detection probability of amphibians differed sharply between species, the survey method used, and the date of the survey. These three covariates also interacted. Thus, a minimum of three visits spread over the breeding season, using a combination of all three survey methods, is needed to reach a 95% detection level for all species in the Mediterranean region. Synthesis and applications: detection/nondetection surveys combined with a site-occupancy modeling approach are powerful tools for estimating the detection probability and determining the prospecting effort necessary to assert that a species is absent from a site.
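The arithmetic behind "how many visits reach a 95% detection level" follows from treating visits as independent trials. The per-visit probability below is hypothetical; in the study it is estimated per species from the occupancy models.

```python
import math

def cumulative_detection(p_per_visit, n_visits):
    """Probability of detecting a species at least once in n independent
    visits, given a per-visit detection probability p."""
    return 1.0 - (1.0 - p_per_visit) ** n_visits

def visits_needed(p_per_visit, target=0.95):
    """Smallest number of visits whose cumulative detection reaches the
    target level."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_per_visit))

# With a hypothetical per-visit probability of 0.65 (all three methods
# combined), three visits clear the 95% threshold.
p3 = cumulative_detection(0.65, 3)
n = visits_needed(0.65)
```

The same calculation run with a species' lowest seasonal detection probability gives the prospecting effort needed before absence can be asserted with the chosen confidence.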
Staveteig, Sarah; Aryeetey, Richmond; Anie-Ansah, Michael; Ahiadeke, Clement; Ortiz, Ladys
2017-01-01
Background: The intended meaning behind responses to standard questions posed in large-scale health surveys are not always well understood. Systematic follow-up studies, particularly those which pose a few repeated questions followed by open-ended discussions, are well positioned to gauge stability and consistency of data and to shed light on the intended meaning behind survey responses. Such follow-up studies require extensive coordination and face challenges in protecting respondent confidentiality during the process of recontacting and reinterviewing participants. Objectives: We describe practical field strategies for undertaking a mixed methods follow-up study during a large-scale health survey. Methods: The study was designed as a mixed methods follow-up study embedded within the 2014 Ghana Demographic and Health Survey (GDHS). The study was implemented in 13 clusters. Android tablets were used to import reference data from the parent survey and to administer the questionnaire, which asked a mixture of closed- and open-ended questions on reproductive intentions, decision-making, and family planning. Results: Despite a number of obstacles related to recontacting respondents and concern about respondent fatigue, over 92 percent of the selected sub-sample were successfully recontacted and reinterviewed; all consented to audio recording. A confidential linkage between GDHS data, follow-up tablet data, and audio transcripts was successfully created for the purpose of analysis. Conclusions: We summarize the challenges in follow-up study design, including ethical considerations, sample size, auditing, filtering, successful use of tablets, and share lessons learned for future such follow-up surveys. PMID:28145817
NASA Astrophysics Data System (ADS)
Kupila, Juho
2017-04-01
Since the 1990s, a large amount of groundwater and soil data has been collected in several regional projects in Finland. The EU-funded project "The coordination of groundwater protection and aggregates industry in Finnish Lapland, phase II" started in July 2016 and covers the last unstudied areas of these projects in Finland. The project is carried out by the Geological Survey of Finland (GTK), the University of Oulu, and the Finnish Environment Institute, and its main aim is to reconcile groundwater protection with the extractable use of soil resources in the Lapland area. As before, several kinds of studies are carried out throughout this three-year research and development project. These include drilling and installation of groundwater observation wells, GPR surveys, and many kinds of point observations, such as sampling and general field mapping. Owing to the size of the study area (over 80,000 km2, about one quarter of the total area of Finland), improving the field work methods has become essential. For general field observation, GTK has developed specific mobile applications for Android devices. With these apps, data can easily be collected, for example from a given groundwater area, and then uploaded directly to GTK's database. The collected information may include sampling data, photos, layer observations, groundwater data, etc., all linked to the current GPS location. New data are also easily available for post-processing. In this project the benefits of these applications will be field-tested; ergonomics, economy, and general usability will be assessed and compared with other data collection methods, such as working with heavy fieldwork laptops. Although these apps are designed for use in GTK's projects, they are free to download from Google Play for anyone interested. The Geological Survey of Finland has the main role in this project, with support from national and local authorities and stakeholders.
The project is funded by the European Regional Development Fund, with support from local municipalities, branch enterprises, and the executive parties of the project. The implementation period is 2016-2019.
Hu, Mengyao; Gremel, Garrett W; Kirlin, John A; West, Brady T
2017-05-01
Background: Food acquisition diary surveys are important for studying food expenditures, factors affecting food acquisition decisions, and relations between these decisions and selected measures of health (e.g., body mass index, self-reported health). However, to our knowledge, no studies have evaluated the errors associated with these diary surveys, which can bias survey estimates and research findings. The use of paradata, which has been largely ignored in previous literature on diary surveys, could be useful for studying errors in these surveys. Objective: We used paradata to assess survey errors in the National Household Food Acquisition and Purchase Survey (FoodAPS). Methods: To evaluate the patterns of nonresponse over the diary period, we fit a multinomial logistic regression model to data from this 1-wk diary survey. We also assessed factors influencing respondents' probability of reporting food acquisition events during the diary process by using logistic regression models. Finally, with the use of an ordinal regression model, we studied factors influencing respondents' perceived ease of participation in the survey. Results: As the diary period progressed, nonresponse increased, especially for those starting the survey on Friday (where the odds of a refusal increased by 12% with each fielding day). The odds of reporting food acquisition events also decreased by 6% with each additional fielding day. Similarly, the odds of reporting ≥1 food-away-from-home event (i.e., meals, snacks, and drinks obtained outside the home) decreased significantly over the fielding period. Male respondents, larger households, households that eat together less often, and households with frequent guests reported a significantly more difficult time getting household members to participate, as did non-English-speaking households and households currently experiencing difficult financial conditions.
Conclusions: Nonresponse and underreporting of food acquisition events tended to increase in the FoodAPS as data collection proceeded. This analysis of paradata available in the FoodAPS revealed these errors and suggests methodologic improvements for future food acquisition surveys. © 2017 American Society for Nutrition.
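The per-day odds ratios quoted above can be turned into probabilities with the usual logistic-model arithmetic. A minimal sketch, using a hypothetical 5% baseline refusal probability (the baseline is not reported in the abstract; only the 12%-per-day odds ratio is):

```python
def refusal_probability(p0, or_per_day, days):
    """Convert a baseline refusal probability into the probability after
    `days` fielding days, given a per-day odds ratio from a logistic model.
    The baseline p0 here is an illustrative assumption, not a FoodAPS figure."""
    odds0 = p0 / (1 - p0)
    odds = odds0 * or_per_day ** days
    return odds / (1 + odds)

# With a 5% baseline and the 12%-per-day odds increase reported for
# Friday starters, the refusal probability roughly doubles over a week.
p_day7 = refusal_probability(0.05, 1.12, 7)
```

This is how a "12% increase in the odds per day" compounds: the odds, not the probability, are multiplied by 1.12 each day.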
Synergies between exoplanet surveys and variable star research
NASA Astrophysics Data System (ADS)
Kovacs, Geza
2017-09-01
With the discovery of the first transiting extrasolar planetary system back in 1999, a great number of projects started to hunt for other similar systems. Because the incidence rate of such systems was unknown and the length of the shallow transit events is only a few percent of the orbital period, the goal was to monitor continuously as many stars as possible for at least a period of a few months. Small-aperture, large field of view automated telescope systems have been installed, with a parallel development of new data reduction and analysis methods, leading to better than 1% per data point precision for thousands of stars. With the successful launch of the photometric satellites CoRoT and Kepler, the precision increased further by one to two orders of magnitude. Millions of stars have been analyzed and searched for transits. In the history of variable star astronomy this is the biggest undertaking so far, resulting in photometric time series inventories immensely valuable for the whole field. In this review we briefly discuss the methods of data analysis that were inspired by the main science driver of these surveys and highlight some of the most interesting variable star results that impact the field of variable star astronomy.
NASA Technical Reports Server (NTRS)
Henderson, R. G.; Thomas, G. S.; Nalepka, R. F.
1975-01-01
Methods of performing signature extension, using LANDSAT-1 data, are explored. The emphasis is on improving the performance and cost-effectiveness of large-area wheat surveys. Two new methods were developed: ASC and MASC. Two methods previously used with aircraft data, Ratio and RADIFF, were adapted to and tested on LANDSAT-1 data. An investigation into the sources and nature of between-scene data variations was included. Initial investigations into the selection of training fields without in situ ground truth were undertaken.
NASA Astrophysics Data System (ADS)
Sugg, Paul G.
The state curriculum in Texas was amended in 1997 to require field investigations in all science classes. This study attempted to explore and add to the research base of information about the efficacy and use of field investigations as important but often underutilized tools in science and environmental instruction. The underlying theme of the study was the view that urban students should receive more instruction in natural settings and that doing so not only improves science learning but also environmental literacy. A sequential mixed method approach was employed to investigate teacher and principal participation in, and perceptions of, outdoor field investigations in public school instruction. In the quantitative phase, surveys were administered to 277 science teachers and 96 principals in a large, urban, Texas district. Significant differences (p ≤ .05) were found between teachers and principals who utilized the field investigation and those who did not. In the qualitative phase, 12 teachers were interviewed about various factors related to field investigations. The study found that while science teachers generally have positive opinions of field studies, awareness of the requirement to provide them is low and obstacles remain which prevent teachers from employing the method. Many science teachers are not providing opportunities for their students to experience science and environmental education instruction in natural settings. Half of the teachers and more than a third of the principals surveyed were not aware of the requirement to provide students with field investigations. 
The study generated quantitative and qualitative evidence demonstrating that teacher use of the field investigation method is strongly linked to the following factors: (a) teacher and principal awareness of the requirement; (b) administrator support; (c) funding for transportation to appropriate natural settings; (d) intra- or interdepartmental competition for limited field trip opportunities; and (e) teacher training. The presence or absence of these factors has significant implications for policy and practice in science and environmental education. The findings supply data that could be used by state and local administrators, curriculum superintendents, science curriculum leaders, elementary and secondary principals, and science educators to guide and improve science and environmental instruction in the state.
Clement, Matthew; O'Keefe, Joy M; Walters, Brianne
2015-01-01
While numerous methods exist for estimating abundance when detection is imperfect, these methods may not be appropriate due to logistical difficulties or unrealistic assumptions. In particular, if highly mobile taxa are frequently absent from survey locations, methods that estimate a probability of detection conditional on presence will generate biased abundance estimates. Here, we propose a new estimator for estimating abundance of mobile populations using telemetry and counts of unmarked animals. The estimator assumes that the target population conforms to a fission-fusion grouping pattern, in which the population is divided into groups that frequently change in size and composition. If assumptions are met, it is not necessary to locate all groups in the population to estimate abundance. We derive an estimator, perform a simulation study, conduct a power analysis, and apply the method to field data. The simulation study confirmed that our estimator is asymptotically unbiased with low bias, narrow confidence intervals, and good coverage, given a modest survey effort. The power analysis provided initial guidance on survey effort. When applied to small data sets obtained by radio-tracking Indiana bats, abundance estimates were reasonable, although imprecise. The proposed method has the potential to improve abundance estimates for mobile species that have a fission-fusion social structure, such as Indiana bats, because it does not condition detection on presence at survey locations and because it avoids certain restrictive assumptions.
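The abstract does not give the estimator's closed form, but its spirit, scaling counts of unmarked animals at surveyed groups by the fraction of telemetered animals found in those groups, can be sketched as follows. This function is an illustrative stand-in under that reading, not the authors' actual derivation:

```python
def abundance_estimate(group_counts, tagged_found, tagged_total):
    """Illustrative fission-fusion abundance estimator (a sketch, not the
    published estimator): counts at the surveyed groups are scaled up by
    the fraction of radio-tagged animals located in those groups, which
    under fission-fusion mixing estimates the fraction of the population
    that the surveyed groups contain."""
    if tagged_found == 0:
        raise ValueError("no telemetered animals found in surveyed groups")
    surveyed_fraction = tagged_found / tagged_total
    return sum(group_counts) / surveyed_fraction
```

For example, counting 50 bats across two roosts while locating 5 of 10 tagged bats there would yield an estimate of 100. This is also why not all groups need to be located: the telemetry sample calibrates how much of the population was seen.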
ERIC Educational Resources Information Center
Kramsch, Claire
2014-01-01
This paper surveys the research methods and approaches used in the multidisciplinary field of applied language studies or language education over the last forty years. Drawing on insights gained in psycho- and sociolinguistics, educational linguistics and linguistic anthropology with regard to language and culture, it is organized around five…
Transdisciplinarity in Research: Perspectives of Early Career Faculty
ERIC Educational Resources Information Center
Moore, Megan; Martinson, Melissa L.; Nurius, Paula S.; Kemp, Susan P.
2018-01-01
Background: Early career faculty experiences and perspectives on transdisciplinary research are important yet understudied. Methods: Assistant professors at 50 top-ranked social work programs completed an online survey assessing perspectives on the salience of transdisciplinary training in their field, obstacles to or negative impacts of…
NASA Astrophysics Data System (ADS)
Puhek, Miro
The present doctoral thesis presents a case study within the scope of which real and virtual field trips have been compared. The emphasis of the study was on determining the levels of knowledge gain effectiveness in the fields of biology and ecology in the final triad (third) of lower secondary school education. The analysis included students completing various tasks along the Maribor Island natural education trail, which had been digitized and inserted into Geopedia. The study was conducted in autumn of 2011 and included 464 students (enrolled in grades 6 to 9) from 11 lower secondary schools located in the Maribor area. The results have generally shown minute differences in the levels of knowledge acquisition effectiveness between the two field trips. During the real field trip, the majority of the students included in the study achieved better results particularly at tasks where they were able to benefit from first-hand experience. During the virtual field trip, individual students were more successful at tasks where they were allowed to access a computer in order to obtain additional information. Within the scope of the study, we had also surveyed lower secondary and secondary school teachers on the frequency of including field trips in the curriculum, on the obstacles that the teachers faced with regard to including field work in it, and on their views on real and virtual field trips. The survey included a total of 386 teachers, the majority of whom were teaching the subjects of biology, geography, and natural science. The results have shown that the surveyed teachers regard field trips as a very important educational method that particularly encourages experience-based learning in nature. The views of the teachers on virtual field trips were generally positive, but only when regarded and applied as a supplemental teaching tool and not as a substitute for real field trips.
Examining Menstrual Tracking to Inform the Design of Personal Informatics Tools
Epstein, Daniel A.; Lee, Nicole B.; Kang, Jennifer H.; Agapie, Elena; Schroeder, Jessica; Pina, Laura R.; Fogarty, James; Kientz, Julie A.; Munson, Sean A.
2017-01-01
We consider why and how women track their menstrual cycles, examining their experiences to uncover design opportunities and extend the field's understanding of personal informatics tools. To understand menstrual cycle tracking practices, we collected and analyzed data from three sources: 2,000 reviews of popular menstrual tracking apps, a survey of 687 people, and follow-up interviews with 12 survey respondents. We find that women track their menstrual cycle for varied reasons that include remembering and predicting their period as well as informing conversations with healthcare providers. Participants described six methods of tracking their menstrual cycles, including use of technology, awareness of their premenstrual physiological states, and simply remembering. Although women find apps and calendars helpful, these methods are ineffective when predictions of future menstrual cycles are inaccurate. Designs can create feelings of exclusion for gender and sexual minorities. Existing apps also generally fail to consider life stages that women experience, including young adulthood, pregnancy, and menopause. Our findings encourage expanding the field's conceptions of personal informatics. PMID:28516176
Thermometric well testing on the Vietnam offshore
DOE Office of Scientific and Technical Information (OSTI.GOV)
San, T.N.; Shtyrlin, V.F.; Vakhitov, G.G.
1994-12-31
It is impossible to control and adjust oil and gas field development without determining the flow intervals of production wells. Production profiles are preferably obtained with a downhole flowmeter; however, there are major restrictions on its widespread application offshore Vietnam: the flowmeter spinner velocity cannot be read correctly in open-hole wells of nonuniform diameter, and the survey cannot be carried out when the tubing shoe is 300-500 m below the top of the formation. In this paper, the authors present a summary of the temperature profile method used to determine the flowing and intaking intervals of wells drilled in the basement of the White Tiger Field offshore Vietnam. Over the last 2 years more than 30 wells were surveyed by this method under the above-mentioned conditions. This paper presents the theory and practice of well temperature profile surveys, with concrete examples of data interpretation using the Oiltest software.
NASA Astrophysics Data System (ADS)
Krzyżek, Robert; Przewięźlikowska, Anna
2017-12-01
When surveying the corners of building structures, surveyors frequently combine two surveying methods. The first involves determining several corners with reference to a geodetic control using classical methods of surveying field details. The second covers the remaining corner points of the structure, which are determined in sequence by distance-distance intersection, using control linear measures of the wall faces of the building, the so-called tie distances. This paper assesses the accuracy of coordinates of corner points of a building structure determined by distance-distance intersection, based on the corners that had previously been determined by surveys tied to a geodetic control. It should be noted, however, that this way of surveying the corners of building structures from linear measures is applied to details of the first-order accuracy, while the regulations explicitly allow such measurement only for details of the second- and third-order accuracy. This raises the question of whether the legal provision is unfounded, or whether surveyors are acting both against the applicable standards and without due diligence when performing such surveys. This study provides answers to this problem. The main purpose of the study was to verify whether the method actually used in practice for surveying building structures yields the required accuracy of the coordinates being determined, or whether it should be strictly forbidden. The results of the conducted studies clearly demonstrate that the problem is definitely more complex. 
Ultimately, however, it may be concluded that the accuracy of building corner locations determined using a combination of the two surveying methods meets the requirements of the regulation (MIA, 2011), subject to compliance with the relevant baseline criteria presented in this study. Observance of the proposed boundary conditions would allow surveyors to routinely survey building structures from tie distances while maintaining the applicable accuracy criteria. It would also allow the resulting surveying documentation to be included in the database of the national geodetic and cartographic documentation center on a proper legal basis.
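The distance-distance intersection used for the remaining corners is ordinary circle-circle intersection from two tie distances. A self-contained sketch (the coordinate convention and the side-selection flag are my own choices; surveying practice also applies the accuracy criteria discussed above):

```python
import math

def distance_intersection(p1, d1, p2, d2, right_side=True):
    """Locate a point from two known points and the measured distances
    to it (the 'tie distance' construction). The two circles generally
    intersect in two points; `right_side` selects which solution to
    return relative to the direction p1 -> p2."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    d = math.hypot(dx, dy)
    if d == 0 or d > d1 + d2 or d < abs(d1 - d2):
        raise ValueError("tie distances are inconsistent: circles do not intersect")
    # Distance from p1 to the foot of the chord, and half-chord length.
    a = (d1 * d1 - d2 * d2 + d * d) / (2 * d)
    h = math.sqrt(max(d1 * d1 - a * a, 0.0))
    mx, my = p1[0] + a * dx / d, p1[1] + a * dy / d
    sign = 1.0 if right_side else -1.0
    return (mx + sign * h * dy / d, my - sign * h * dx / d)
```

For example, a corner measured at 3 m and 5 m from two known corners 4 m apart sits at one of two mirror positions; the surveyor resolves the ambiguity from the building's geometry.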
A Survey of Colormaps in Visualization
Zhou, Liang; Hansen, Charles D.
2016-01-01
Colormaps are a vital means for users to gain insight into data in a visualization. With a good choice of colormap, users are able to acquire information from the data more effectively and efficiently. In this survey, we provide a comprehensive review of colormap generation techniques and a taxonomy that helps readers find appropriate techniques for their data and applications. Specifically, we first briefly introduce the basics of color spaces, including color appearance models. In the core of our paper, we survey colormap generation techniques, including the latest advances in the field, by grouping these techniques into four classes: procedural methods, user-study based methods, rule-based methods, and data-driven methods; we also include a section on methods that go beyond pure data comprehension. We then classify colormapping techniques into a taxonomy so that readers can quickly identify the appropriate techniques they might use. Furthermore, a representative set of visualization techniques that explicitly discuss the use of colormaps is reviewed and classified based on the nature of the data in these applications. Our paper is also intended as a reference for colormap choices when readers are faced with similar data and/or tasks. PMID:26513793
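The simplest member of the "procedural methods" class can be shown in a few lines: linear RGB interpolation between two endpoint colors, with a luma check as a crude proxy for perceptual ordering. This is illustrative only; serious colormap design works in perceptual color spaces, which is exactly what the survey's taxonomy covers:

```python
def make_colormap(c0, c1, n=256):
    """Procedural colormap: linear interpolation in RGB between two
    endpoint colors (each an (r, g, b) tuple in [0, 1])."""
    return [tuple(a + (b - a) * i / (n - 1) for a, b in zip(c0, c1))
            for i in range(n)]

def luma(rgb):
    """Rec. 709 luma: a quick, imperfect proxy for perceived lightness,
    used here only to check that the ramp is monotonically ordered."""
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# A dark-blue-to-yellow ramp whose luma increases monotonically.
cmap = make_colormap((0.0, 0.0, 0.5), (1.0, 1.0, 0.5))
```

Monotonic lightness is one of the rule-based criteria the survey discusses; interpolating in RGB satisfies it here only because the endpoints were chosen to make it so.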
Hunting Faint Dwarf Galaxies in the Field Using Integrated Light Surveys
NASA Astrophysics Data System (ADS)
Danieli, Shany; van Dokkum, Pieter; Conroy, Charlie
2018-03-01
We discuss the approach of searching for the lowest-mass dwarf galaxies, ≲10^6 M_⊙, in the general field using integrated light surveys. By exploring the limiting surface brightness versus spatial resolution (μ_eff,lim-θ) parameter space, we suggest that faint field dwarfs in the Local Volume, between 3 and 10 Mpc, should be detected very effectively and in large numbers by integrated light photometric surveys, complementary to the classical star-counts method. We use a sample of dwarf galaxies in the Local Group to construct relations between their photometric and structural parameters, M_*-μ_eff,V and M_*-R_eff. We use these relations, along with assumed functional forms for the halo mass function and the stellar mass-halo mass (SMHM) relation, to calculate the lowest detectable stellar masses in the Local Volume and the expected number of galaxies as a function of the limiting surface brightness and spatial resolution. The number of detected galaxies depends mostly on the limiting surface brightness for distances >3 Mpc, while spatial resolution starts to play a role for galaxies at distances >8 Mpc. Surveys with μ_eff,lim ∼ 30 mag arcsec^-2 should be able to detect galaxies with stellar masses down to ∼10^4 M_⊙ in the Local Volume. Depending on the form of the SMHM relation, the expected number of dwarf galaxies with distances between 3 and 10 Mpc is 0.04-0.35 per square degree, assuming a limiting surface brightness of ∼29-30 mag arcsec^-2 and a spatial resolution <4″. We plan to search for a population of low-mass dwarf galaxies in the field by performing a blank wide-field photometric survey with the Dragonfly Telephoto Array, an imaging system optimized for the detection of extended ultra-low surface brightness structures.
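Inverting a linear log M_* versus μ_eff relation to get the lowest detectable stellar mass at a given limiting surface brightness can be sketched as below. The slope and anchor point are illustrative placeholders, not the paper's fitted Local Group coefficients:

```python
def limiting_stellar_mass(mu_lim, slope=-2.5, log_mass_at_mu27=6.0):
    """Lowest detectable stellar mass (in solar masses) for a survey
    reaching surface brightness mu_lim (mag/arcsec^2), assuming a linear
    relation log10(M*) = log_mass_at_mu27 + (mu_eff - 27) / slope.
    The default slope and anchor are hypothetical, chosen only so the
    trend (fainter limit -> lower mass) is visible."""
    return 10 ** (log_mass_at_mu27 + (mu_lim - 27.0) / slope)

# Pushing the limit from 27 to 30 mag/arcsec^2 lowers the detectable
# mass by more than an order of magnitude under these assumed numbers.
m27 = limiting_stellar_mass(27.0)
m30 = limiting_stellar_mass(30.0)
```

The paper's actual calculation additionally folds in the halo mass function and SMHM relation to convert this limit into expected counts per square degree.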
Initial Data Release of the Kepler-INT Survey
NASA Astrophysics Data System (ADS)
Greiss, S.; Steeghs, D.; Gänsicke, B. T.; Martín, E. L.; Groot, P. J.; Irwin, M. J.; González-Solares, E.; Greimel, R.; Knigge, C.; Østensen, R. H.; Verbeek, K.; Drew, J. E.; Drake, J.; Jonker, P. G.; Ripepi, V.; Scaringi, S.; Southworth, J.; Still, M.; Wright, N. J.; Farnhill, H.; van Haaften, L. M.; Shah, S.
2012-07-01
This paper describes the first data release of the Kepler-INT Survey (KIS) that covers a 116 deg2 region of the Cygnus and Lyra constellations. The Kepler field is the target of the most intensive search for transiting planets to date. Despite the fact that the Kepler mission provides superior time-series photometry, with an enormous impact on all areas of stellar variability, its field lacks optical photometry complete to the confusion limit of the Kepler instrument necessary for selecting various classes of targets. For this reason, we follow the observing strategy and data reduction method used in the IPHAS and UVEX galactic plane surveys in order to produce a deep optical survey of the Kepler field. This initial release concerns data taken between 2011 May and August, using the Isaac Newton Telescope on the island of La Palma. Four broadband filters were used, U, g, r, i, as well as one narrowband one, Hα, reaching down to a 10σ limit of ~20th mag in the Vega system. Observations covering ~50 deg2, thus about half of the field, passed our quality control thresholds and constitute this first data release. We derive a global photometric calibration by placing the KIS magnitudes as close as possible to the Kepler Input Catalog (KIC) photometry. The initial data release catalog containing around 6 million sources from all the good photometric fields is available for download from the KIS Web site (www.astro.warwick.ac.uk/research/kis/) as well as via MAST (KIS magnitudes can be retrieved using the MAST enhanced target search page http://archive.stsci.edu/kepler/kepler_fov/search.php and also via Casjobs at MAST Web site http://mastweb.stsci.edu/kplrcasjobs/).
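One common, robust way to derive a global calibration offset of the kind described ("placing the KIS magnitudes as close as possible to the KIC photometry") is the median of per-star magnitude differences against the reference catalog. A sketch; the KIS pipeline's exact procedure may differ:

```python
import statistics

def zero_point_offset(survey_mags, reference_mags):
    """Global photometric zero-point offset: the median of per-star
    differences (reference - survey) for cross-matched stars. The median
    is preferred over the mean because it resists outliers from blends,
    variables, and mismatches."""
    diffs = [ref - obs for obs, ref in zip(survey_mags, reference_mags)]
    return statistics.median(diffs)
```

Adding the returned offset to every survey magnitude ties the survey photometry to the reference system.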
Self-consistent Bulge/Disk/Halo Galaxy Dynamical Modeling Using Integral Field Kinematics
NASA Astrophysics Data System (ADS)
Taranu, D. S.; Obreschkow, D.; Dubinski, J. J.; Fogarty, L. M. R.; van de Sande, J.; Catinella, B.; Cortese, L.; Moffett, A.; Robotham, A. S. G.; Allen, J. T.; Bland-Hawthorn, J.; Bryant, J. J.; Colless, M.; Croom, S. M.; D'Eugenio, F.; Davies, R. L.; Drinkwater, M. J.; Driver, S. P.; Goodwin, M.; Konstantopoulos, I. S.; Lawrence, J. S.; López-Sánchez, Á. R.; Lorente, N. P. F.; Medling, A. M.; Mould, J. R.; Owers, M. S.; Power, C.; Richards, S. N.; Tonini, C.
2017-11-01
We introduce a method for modeling disk galaxies designed to take full advantage of data from integral field spectroscopy (IFS). The method fits equilibrium models to simultaneously reproduce the surface brightness, rotation, and velocity dispersion profiles of a galaxy. The models are fully self-consistent 6D distribution functions for a galaxy with a Sérsic profile stellar bulge, exponential disk, and parametric dark-matter halo, generated by an updated version of GalactICS. By creating realistic flux-weighted maps of the kinematic moments (flux, mean velocity, and dispersion), we simultaneously fit photometric and spectroscopic data using both maximum-likelihood and Bayesian (MCMC) techniques. We apply the method to a GAMA spiral galaxy (G79635) with kinematics from the SAMI Galaxy Survey and deep g- and r-band photometry from the VST-KiDS survey, comparing parameter constraints with those from traditional 2D bulge-disk decomposition. Our method returns broadly consistent results for shared parameters while constraining the mass-to-light ratios of stellar components and reproducing the H I-inferred circular velocity well beyond the limits of the SAMI data. Although the method is tailored for fitting integral field kinematic data, it can use other dynamical constraints like central fiber dispersions and H I circular velocities, and is well-suited for modeling galaxies with a combination of deep imaging and H I and/or optical spectra (resolved or otherwise). Our implementation (MagRite) is computationally efficient and can generate well-resolved models and kinematic maps in under a minute on modern processors.
Field trials of line transect methods applied to estimation of desert tortoise abundance
Anderson, David R.; Burnham, Kenneth P.; Lubow, Bruce C.; Thomas, L. E. N.; Corn, Paul Stephen; Medica, Philip A.; Marlow, R.W.
2001-01-01
We examine the degree to which field observers can meet the assumptions underlying line transect sampling to monitor populations of desert tortoises (Gopherus agassizii). We present the results of 2 field trials using artificial tortoise models in 3 size classes. The trials were conducted on 2 occasions on an area south of Las Vegas, Nevada, where the density of the test population was known. In the first trials, conducted largely by experienced biologists who had been involved in tortoise surveys for many years, the density of adult tortoise models was well estimated (-3.9% bias), while the bias was higher (-20%) for subadult tortoise models. The bias for combined data was -12.0%. The bias was largely attributed to the failure to detect all tortoise models on or near the transect centerline. The second trials were conducted with a group of largely inexperienced student volunteers and used somewhat different searching methods, and the results were similar to the first trials. Estimated combined density of subadult and adult tortoise models had a negative bias (-7.3%), again attributable to failure to detect some models on or near the centerline. Experience in desert tortoise biology, either comparing the first and second trials or in the second trial with 2 experienced biologists versus 16 novices, did not have an apparent effect on the quality of the data or the accuracy of the estimates. Observer training, specific to line transect sampling, and field testing are important components of a reliable survey. Line transect sampling represents a viable method for large-scale monitoring of populations of desert tortoise; however, field protocol must be improved to assure the key assumptions are met.
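The bias figures quoted above come from comparing line-transect density estimates against the known density of the test population. A sketch using a half-normal detection function, a standard choice in distance sampling (the detection-scale parameter σ below is an assumed input, not a value fitted in the trials):

```python
import math

def transect_density(n_detected, line_length, sigma):
    """Line-transect density estimate with a half-normal detection
    function g(x) = exp(-x^2 / (2 sigma^2)). Its integral gives an
    effective strip half-width of sigma * sqrt(pi / 2), so density is
    detections per unit of effectively searched area."""
    esw = sigma * math.sqrt(math.pi / 2.0)
    return n_detected / (2.0 * esw * line_length)

def percent_bias(estimate, truth):
    """Relative bias of an estimate against a known true density,
    as reported for the tortoise-model trials (e.g., -3.9%)."""
    return 100.0 * (estimate - truth) / truth
```

Missed tortoises on or near the centerline violate the key assumption g(0) = 1, which is exactly the mechanism behind the negative biases the trials report.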
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Xiao-Dong; Park, Changbom; Forero-Romero, J. E.
We propose a method based on the redshift dependence of the Alcock-Paczynski (AP) test to measure the expansion history of the universe. It uses the isotropy of the galaxy density gradient field to constrain cosmological parameters. If the density parameter Ω_m or the dark energy equation of state w are incorrectly chosen, the gradient field appears to be anisotropic, with the degree of anisotropy varying with redshift. We use this effect to constrain the cosmological parameters governing the expansion history of the universe. Although redshift-space distortions (RSD) induced by galaxy peculiar velocities also produce anisotropies in the gradient field, these effects are close to uniform in magnitude over a large range of redshift. This makes the redshift variation of the gradient field anisotropy relatively insensitive to the RSD. By testing the method on mock surveys drawn from the Horizon Run 3 cosmological N-body simulations, we demonstrate that the cosmological parameters can be estimated without bias. Our method is complementary to the baryon acoustic oscillation or topology methods, as it depends on D_A·H, the product of the angular diameter distance and the Hubble parameter.
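The AP combination D_A(z)·H(z) depends on cosmology through both distance and expansion rate, which is what makes its redshift dependence a probe of Ω_m and w. A minimal flat-wCDM sketch with trapezoid-rule integration (parameter values are illustrative defaults, not the paper's):

```python
import math

C_KM_S = 299792.458  # speed of light, km/s

def hubble(z, om, w, h0=70.0):
    """H(z) in km/s/Mpc for a flat wCDM cosmology."""
    return h0 * math.sqrt(om * (1 + z) ** 3
                          + (1 - om) * (1 + z) ** (3 * (1 + w)))

def comoving_distance(z, om, w, h0=70.0, steps=1000):
    """c * integral_0^z dz' / H(z') by the trapezoid rule, in Mpc."""
    dz = z / steps
    total = 0.0
    for i in range(steps):
        z1, z2 = i * dz, (i + 1) * dz
        total += 0.5 * dz * (1.0 / hubble(z1, om, w, h0)
                             + 1.0 / hubble(z2, om, w, h0))
    return C_KM_S * total

def da_times_h(z, om, w):
    """The AP combination D_A(z) * H(z) (km/s): angular diameter
    distance times Hubble parameter."""
    d_a = comoving_distance(z, om, w) / (1 + z)
    return d_a * hubble(z, om, w)
```

Evaluating da_times_h at several redshifts for a wrong (Ω_m, w) pair gives a different redshift trend than for the true pair; the anisotropy of the gradient field traces exactly that mismatch.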
Digital Archiving of People Flow by Recycling Large-Scale Social Survey Data of Developing Cities
NASA Astrophysics Data System (ADS)
Sekimoto, Y.; Watanabe, A.; Nakamura, T.; Horanont, T.
2012-07-01
Data on people flow have become increasingly important in business, including marketing and public services. Although mobile phones enable a person's position to be located to a certain degree, it is a challenge to acquire sufficient data from people with mobile phones. In order to grasp people flow in its entirety, it is important to establish a practical method of reconstructing people flow from various kinds of existing fragmentary spatio-temporal data, such as social survey data. For example, although typical Person Trip Survey data collected by the public sector record only fragmentary spatio-temporal positions, the data are attractive because the sample size is large enough to estimate the entire flow of people. In this study, we apply our proposed basic method to Japan International Cooperation Agency (JICA) PT data pertaining to developing cities around the world, and we propose some correction methods to resolve the difficulties in applying it to many cities and applying it stably to infrastructure data.
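At its simplest, reconstructing flow from fragmentary spatio-temporal records reduces to interpolating a person's position between observations. A minimal stand-in for that step (linear interpolation between (time, x, y) records; the study's actual reconstruction and correction methods are more elaborate):

```python
def interpolate_position(records, t):
    """Estimate a position at time t by linear interpolation between the
    two nearest fragmentary (time, x, y) observations. `records` is any
    iterable of (t, x, y) tuples; a sketch of the reconstruction step,
    not the paper's full method."""
    records = sorted(records)
    for (t0, x0, y0), (t1, x1, y1) in zip(records, records[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0) if t1 > t0 else 0.0
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
    raise ValueError("t is outside the observed time range")
```

Running this for every surveyed person at a common grid of timestamps yields a snapshot of the estimated flow at each instant, the kind of archive the paper digitizes.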
Quality of survey reporting in nephrology journals: a methodologic review.
Li, Alvin Ho-Ting; Thomas, Sonia M; Farag, Alexandra; Duffett, Mark; Garg, Amit X; Naylor, Kyla L
2014-12-05
Survey research is an important research method used to determine individuals' attitudes, knowledge, and behaviors; however, as with other research methods, inadequate reporting threatens the validity of results. This study aimed to describe the quality of reporting of surveys published between 2001 and 2011 in the field of nephrology. The top nephrology journals were systematically reviewed (2001-2011: American Journal of Kidney Diseases, Nephrology Dialysis Transplantation, and Kidney International; 2006-2011: Clinical Journal of the American Society of Nephrology) for studies whose primary objective was to collect and report survey results. Included were nephrology journals with a heavy focus on clinical research and high impact factors. All titles and abstracts were screened in duplicate. Surveys were excluded if they were part of a multimethod study, evaluated only psychometric characteristics, or used semi-structured interviews. Information was collected on survey and respondent characteristics, questionnaire development (e.g., pilot testing), psychometric characteristics (e.g., validity and reliability), survey methods used to optimize response rate (e.g., system of multiple contacts), and response rate. After a screening of 19,970 citations, 216 full-text articles were reviewed and 102 surveys were included. Approximately 85% of studies reported a response rate. Almost half of the studies (46%) discussed how they developed their questionnaire, and just over a quarter (28%) mentioned the validity or reliability of the questionnaire. The only characteristic that improved over the years was the proportion of articles reporting missing data (2001-2004: 46.4%; 2005-2008: 61.9%; 2009-2011: 84.8%; P<0.01). The quality of survey reporting in nephrology journals remains suboptimal. In particular, reporting of the validity and reliability of the questionnaire must be improved. 
Guidelines to improve survey reporting and increase transparency are clearly needed. Copyright © 2014 by the American Society of Nephrology.
43 CFR 3861.1-3 - Plats and field notes of mineral surveys.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Plats and field notes of mineral surveys...) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) MINERAL PATENT APPLICATIONS Surveys and Plats § 3861.1-3 Plats and field notes of mineral surveys. When the patent is issued...
Grimes, D.J.; Marranzino, A.P.
1968-01-01
Two spectrographic methods are used in mobile field laboratories of the U. S. Geological Survey. In the direct-current arc method, the ground sample is mixed with graphite powder, packed into an electrode crater, and burned to completion. Thirty elements are determined. In the spark method, the sample, ground to pass a 150-mesh screen, is digested in hydrofluoric acid followed by evaporation to dryness and dissolution in aqua regia. The solution is fed into the spark gap by means of a rotating-disk electrode arrangement and is excited with an alternating-current spark discharge. Fourteen elements are determined. In both techniques, light is recorded on Spectrum Analysis No. 1, 35-millimeter film, and the spectra are compared visually with those of standard films.
Nonperturbative light-front Hamiltonian methods
NASA Astrophysics Data System (ADS)
Hiller, J. R.
2016-09-01
We examine the current state-of-the-art in nonperturbative calculations done with Hamiltonians constructed in light-front quantization of various field theories. The language of light-front quantization is introduced, and important (numerical) techniques, such as Pauli-Villars regularization, discrete light-cone quantization, basis light-front quantization, the light-front coupled-cluster method, the renormalization group procedure for effective particles, sector-dependent renormalization, and the Lanczos diagonalization method, are surveyed. Specific applications are discussed for quenched scalar Yukawa theory, ϕ4 theory, ordinary Yukawa theory, supersymmetric Yang-Mills theory, quantum electrodynamics, and quantum chromodynamics. The content should serve as an introduction to these methods for anyone interested in doing such calculations and as a rallying point for those who wish to solve quantum chromodynamics in terms of wave functions rather than random samplings of Euclidean field configurations.
Staveteig, Sarah; Aryeetey, Richmond; Anie-Ansah, Michael; Ahiadeke, Clement; Ortiz, Ladys
2017-01-01
The intended meaning behind responses to standard questions posed in large-scale health surveys is not always well understood. Systematic follow-up studies, particularly those that pose a few repeated questions followed by open-ended discussions, are well positioned to gauge the stability and consistency of data and to shed light on the intended meaning behind survey responses. Such follow-up studies require extensive coordination and face challenges in protecting respondent confidentiality during the process of recontacting and reinterviewing participants. We describe practical field strategies for undertaking a mixed methods follow-up study during a large-scale health survey. The study was designed as a mixed methods follow-up study embedded within the 2014 Ghana Demographic and Health Survey (GDHS) and was implemented in 13 clusters. Android tablets were used to import reference data from the parent survey and to administer the questionnaire, which asked a mixture of closed- and open-ended questions on reproductive intentions, decision-making, and family planning. Despite a number of obstacles related to recontacting respondents and concern about respondent fatigue, over 92 percent of the selected subsample were successfully recontacted and reinterviewed; all consented to audio recording. A confidential linkage between GDHS data, follow-up tablet data, and audio transcripts was successfully created for the purpose of analysis. We summarize the challenges in follow-up study design, including ethical considerations, sample size, auditing, and filtering, as well as the successful use of tablets, and share lessons learned for future follow-up surveys.
ERIC Educational Resources Information Center
Baumgartner, Erin; Zabin, Chela J.
2006-01-01
The study of "zonation", the distribution of plants and animals into distinct spatial areas, is a great way to introduce students to basic ecological concepts. Students can conduct methodical, quantitative surveys of zones in areas as diverse as mudflats, beaches, forests, wetlands, and fields. Students collect data from these areas with field…
PGMS: a case study of collecting PDA-based geo-tagged malaria-related survey data.
Zhou, Ying; Lobo, Neil F; Wolkon, Adam; Gimnig, John E; Malishee, Alpha; Stevenson, Jennifer; Sulistyawati; Collins, Frank H; Madey, Greg
2014-09-01
Using mobile devices, such as personal digital assistants (PDAs), smartphones, and tablet computers, to electronically collect malaria-related field data is likely the future of field questionnaires. This case study seeks to design PGMS, a generic PDA-based tool for collecting geo-tagged malaria-related survey data, that can be used not only for large-scale community-level geo-tagged electronic malaria surveys, but also for a wide variety of electronic data collections on other infectious diseases. The framework includes two parts: the database designed for subsequent cross-sectional data analysis and the customized programs for the six study sites (two in Kenya, three in Indonesia, and one in Tanzania). In addition to the framework development, we also present the methods we used when configuring and deploying the PDAs to 1) reduce data entry errors, 2) conserve battery power, 3) field install the programs onto dozens of handheld devices, 4) translate electronic questionnaires into local languages, 5) prevent data loss, and 6) transfer data from PDAs to computers for analysis and storage. Since 2008, PGMS has successfully supported numerous surveys that recorded 10,871 compounds and households, 52,126 persons, and 17,100 bed nets from the six sites. These numbers are still growing. © The American Society of Tropical Medicine and Hygiene.
An International Survey of Aquaponics Practitioners
Love, David C.; Fry, Jillian P.; Genello, Laura; Hill, Elizabeth S.; Frederick, J. Adam; Li, Ximin; Semmens, Ken
2014-01-01
Aquaponics, a combination of fish farming and soilless plant farming, is growing in popularity and gaining attention as an important and potentially more sustainable method of food production. The aim of this study was to document and analyze the production methods, experiences, motivations, and demographics of aquaponics practitioners in the United States (US) and internationally. The survey was distributed online using a chain sampling method that relied on referrals from initial respondents, with 809 respondents meeting the inclusion criteria. The majority of respondents were from the US (80%), male (78%), and had at least a high school degree (91%). The mean age of respondents was 47±13 years old. Most respondents (52%) had three years or less of aquaponics experience. Respondents typically raised tilapia or ornamental fish and a variety of leafy green vegetables, herbs, and fruiting crops. Respondents were most often motivated to become involved in aquaponics to grow their own food, for environmental sustainability reasons, and for personal health reasons. Many respondents employed more than one method to raise crops, and used alternative or environmentally sustainable sources of energy, water, and fish feed. In general, our findings suggest that aquaponics is a dynamic and rapidly growing field with participants who are actively experimenting with and adopting new technologies. Additional research and outreach is needed to evaluate and communicate best practices within the field. This survey is the first large-scale effort to track aquaponics in the US and provides information that can better inform policy, research, and education efforts regarding aquaponics as it matures and possibly evolves into a mainstream form of agriculture. PMID:25029125
Monitoring gray wolf populations using multiple survey methods
Ausband, David E.; Rich, Lindsey N.; Glenn, Elizabeth M.; Mitchell, Michael S.; Zager, Pete; Miller, David A.W.; Waits, Lisette P.; Ackerman, Bruce B.; Mack, Curt M.
2013-01-01
The behavioral patterns and large territories of large carnivores make them challenging to monitor. Occupancy modeling provides a framework for monitoring the population dynamics and distribution of territorial carnivores. We combined data from hunter surveys, howling and sign surveys conducted at predicted wolf rendezvous sites, and locations of radiocollared wolves to model occupancy and estimate the number of gray wolf (Canis lupus) packs and individuals in Idaho during 2009 and 2010. We explicitly accounted for potential misidentification of occupied cells (i.e., false positives) using an extension of the multi-state occupancy framework. We found agreement between model predictions and the distribution and estimates of the number of wolf packs and individual wolves reported by the Idaho Department of Fish and Game and the Nez Perce Tribe from intensive radiotelemetry-based monitoring. Estimates of individual wolves from occupancy models that excluded data from radiocollared wolves were within an average of 12.0% (SD = 6.0) of existing statewide minimum counts. Models using only hunter survey data generally estimated the lowest abundance, whereas models using all data generally provided the highest estimates of abundance, although only marginally higher. Precision across approaches ranged from 14% to 28% of mean estimates, and models that used all data streams generally provided the most precise estimates. We demonstrated that an occupancy model based on different survey methods can yield estimates of the number and distribution of wolf packs and individual wolf abundance with reasonable measures of precision. Assumptions of the approach, including that average territory size is known, average pack size is known, and territories do not overlap, must be evaluated periodically using independent field data to ensure occupancy estimates remain reliable. Use of multiple survey methods helps to ensure that occupancy estimates are robust to weaknesses or changes in any one survey method.
Occupancy modeling may be useful for standardizing estimates across large landscapes, even if survey methods differ across regions, allowing for inferences about broad-scale population dynamics of wolves.
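The single-season occupancy model underlying approaches like this one can be sketched in a few lines. The snippet below is a toy illustration, not the authors' multi-state false-positive model: it simulates per-site detection histories and recovers occupancy (psi) and per-visit detection probability (p) by grid-search maximum likelihood. All numbers (300 sites, 4 visits, psi = 0.6, p = 0.4) are invented for illustration.

```python
import math
import random
from collections import Counter

def occupancy_mle(detections, n_visits, grid_step=0.01):
    """Grid-search MLE of occupancy (psi) and detection probability (p)
    for the basic single-season occupancy model.  `detections` holds,
    per site, the number of visits on which the species was detected."""
    counts = Counter(detections)
    grid = [round(g * grid_step, 4) for g in range(1, int(1 / grid_step))]
    best, best_ll = None, float("-inf")
    for psi in grid:
        for p in grid:
            ll = 0.0
            for d, n in counts.items():
                if d > 0:
                    lik = psi * p**d * (1 - p)**(n_visits - d)
                else:  # never detected: unoccupied, or occupied but missed
                    lik = psi * (1 - p)**n_visits + (1 - psi)
                ll += n * math.log(lik)
            if ll > best_ll:
                best_ll, best = ll, (psi, p)
    return best

# Simulate 300 sites surveyed 4 times with true psi = 0.6, p = 0.4
random.seed(7)
true_psi, true_p, K = 0.6, 0.4, 4
sites = [sum(random.random() < true_p for _ in range(K))
         if random.random() < true_psi else 0
         for _ in range(300)]
psi_hat, p_hat = occupancy_mle(sites, K)
```

The zero-detection branch is what distinguishes occupancy modeling from a naive proportion: a site with no detections may still be occupied, and the likelihood accounts for both possibilities.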
Reconstructing the Initial Density Field of the Local Universe: Methods and Tests with Mock Catalogs
NASA Astrophysics Data System (ADS)
Wang, Huiyuan; Mo, H. J.; Yang, Xiaohu; van den Bosch, Frank C.
2013-07-01
Our research objective in this paper is to reconstruct an initial linear density field, which follows the multivariate Gaussian distribution with variances given by the linear power spectrum of the current cold dark matter model and evolves through gravitational instabilities to the present-day density field in the local universe. For this purpose, we develop a Hamiltonian Markov Chain Monte Carlo method to obtain the linear density field from a posterior probability function that consists of two components: a prior of a Gaussian density field with a given linear spectrum and a likelihood term that is given by the current density field. The present-day density field can be reconstructed from galaxy groups using the method developed in Wang et al. Using a realistic mock Sloan Digital Sky Survey DR7, obtained by populating dark matter halos in the Millennium simulation (MS) with galaxies, we show that our method can effectively and accurately recover both the amplitudes and phases of the initial, linear density field. To examine the accuracy of our method, we use N-body simulations to evolve these reconstructed initial conditions to the present day. The resimulated density field thus obtained accurately matches the original density field of the MS in the density range 0.3 ≲ ρ/ρ̄ ≲ 20 without any significant bias. In particular, the Fourier phases of the resimulated density fields are tightly correlated with those of the original simulation down to a scale corresponding to a wavenumber of ~1 h Mpc-1, much smaller than the translinear scale, which corresponds to a wavenumber of ~0.15 h Mpc-1.
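The posterior structure described here, a Gaussian prior on the linear field multiplied by a likelihood anchored to the present-day field, can be illustrated with a one-dimensional toy Hamiltonian Monte Carlo sampler. This is a sketch under simplifying assumptions, not the authors' reconstruction code: the "field" is a single scalar mode with prior N(0, 1) and a Gaussian likelihood around an invented observation d = 2, so the analytic posterior is N(1, 0.5).

```python
import math
import random

def hmc(grad_U, U, x0, step=0.15, n_leap=15, n_samples=3000, seed=3):
    """Hamiltonian Monte Carlo for a 1-D potential U(x) = -log posterior."""
    random.seed(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        p = random.gauss(0.0, 1.0)          # fresh momentum each trajectory
        xn, pn = x, p
        pn -= 0.5 * step * grad_U(xn)       # leapfrog half step
        for i in range(n_leap):
            xn += step * pn
            if i < n_leap - 1:
                pn -= step * grad_U(xn)
        pn -= 0.5 * step * grad_U(xn)       # final half step
        # Metropolis accept/reject on the change in total energy
        dH = (U(xn) + 0.5 * pn * pn) - (U(x) + 0.5 * p * p)
        if dH < 0 or random.random() < math.exp(-dH):
            x = xn
        samples.append(x)
    return samples

# Gaussian prior N(0, 1) times a Gaussian likelihood centred on the
# "observed" value d = 2 with unit noise; analytic posterior is N(1, 0.5).
d = 2.0
U = lambda x: 0.5 * x * x + 0.5 * (d - x) ** 2
grad_U = lambda x: x - (d - x)
draws = hmc(grad_U, U, x0=0.0)[500:]        # drop burn-in
mean = sum(draws) / len(draws)
```

The real reconstruction works the same way but in a space of millions of Fourier modes, where HMC's gradient-guided trajectories are what make sampling tractable.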
Advances in Collaborative Filtering
NASA Astrophysics Data System (ADS)
Koren, Yehuda; Bell, Robert
The collaborative filtering (CF) approach to recommenders has recently enjoyed much interest and progress. The fact that it played a central role within the recently completed Netflix competition has contributed to its popularity. This chapter surveys the recent progress in the field. Matrix factorization techniques, which became a first choice for implementing CF, are described together with recent innovations. We also describe several extensions that bring competitive accuracy into neighborhood methods, which used to dominate the field. The chapter demonstrates how to utilize temporal models and implicit feedback to extend the models' accuracy. In passing, we include detailed descriptions of some of the central methods developed for tackling the challenge of the Netflix Prize competition.
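A minimal sketch of the SGD-trained matrix factorization the chapter describes as the first-choice CF technique. Bias terms, implicit feedback, and temporal dynamics are omitted; the ratings matrix and hyperparameters below are invented for illustration.

```python
import random

def factorize(ratings, n_users, n_items, k=2, lr=0.05, reg=0.02,
              epochs=500, seed=0):
    """Stochastic-gradient matrix factorization: learn user factors P and
    item factors Q so that dot(P[u], Q[i]) approximates rating r_ui."""
    rnd = random.Random(seed)
    P = [[rnd.uniform(0.01, 0.1) for _ in range(k)] for _ in range(n_users)]
    Q = [[rnd.uniform(0.01, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - sum(P[u][f] * Q[i][f] for f in range(k))
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                P[u][f] += lr * (err * qi - reg * pu)  # regularized SGD step
                Q[i][f] += lr * (err * pu - reg * qi)
    return P, Q

def predict(P, Q, u, i):
    return sum(pf * qf for pf, qf in zip(P[u], Q[i]))

# Tiny example: 3 users x 3 items, six observed ratings on a 1-5 scale
ratings = [(0, 0, 5), (0, 1, 3), (1, 0, 4), (1, 2, 1), (2, 1, 4), (2, 2, 5)]
P, Q = factorize(ratings, 3, 3)
rmse = (sum((r - predict(P, Q, u, i)) ** 2 for u, i, r in ratings)
        / len(ratings)) ** 0.5
```

Predictions for unobserved (user, item) pairs come from the same dot product, which is what lets the learned factors fill in the sparse rating matrix.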
The Chandra Deep Wide-Field Survey: Completing the new generation of Chandra extragalactic surveys
NASA Astrophysics Data System (ADS)
Hickox, Ryan
2016-09-01
Chandra X-ray surveys have revolutionized our view of the growth of black holes across cosmic time. Recently, fundamental questions have emerged about the connection of AGN to their host large scale structures that clearly demand a wide, deep survey over a large area, comparable to the recent extensive Chandra surveys in smaller fields. We propose the Chandra Deep Wide-Field Survey (CDWFS) covering the central 6 sq. deg in the Bootes field, totaling 1.025 Ms (building on 550 ks from the HRC GTO program). CDWFS will efficiently probe a large cosmic volume, allowing us to carry out accurate new investigations of the connections between black holes and their large-scale structures, and will complete the next generation surveys that comprise a key part of Chandra's legacy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sternberg, B.K.; Thomas, S.J.
1992-12-01
The overall objective of the project was to apply a new high-resolution imaging system to water resource investigations. This imaging system measures the ellipticity of received magnetic-field components. The source of the magnetic field is a long-line transmitter emitting frequencies from 30 Hz to 30 kHz. A new high-accuracy calibration method was used to enhance the resolution of the measurements. The specific objectives included: (1) refine the system hardware and software based on these investigations, (2) learn the limitations of this technology in practical water resource investigations, and (3) improve interpretation techniques to extract the highest possible resolution. Successful field surveys were run at: (1) San Xavier Mine, Arizona, where the flow of injected fluid was monitored with the system; and (2) Avra Valley, Arizona, where subsurface stratigraphy was imaged. A survey at a third site was less successful; the interpreted resistivity section does not agree with nearby well logs. Surveys are continuing at this site.
Development of the Vertical Electro Magnetic Profiling (VEMP) method
NASA Astrophysics Data System (ADS)
Miura, Yasuo; Osato, Kazumi; Takasugi, Shinji; Muraoka, Hirofumi; Yasukawa, Kasumi
1996-09-01
As a part of the "Deep-Seated Geothermal Resources Survey (DSGR)" project being undertaken by the New Energy and Industrial Technology Development Organization (NEDO), the "Vertical Electro Magnetic Profiling (VEMP)" method is being developed to accurately obtain deep resistivity structures. The VEMP method takes multi-frequency three-component magnetic field data in an open hole well using controlled source transmitters emitted at the surface (either loop or grounded-wire sources). Numerical simulations using EM3D have demonstrated that phase data of the VEMP method is not only very sensitive to the general resistivity structure, but will also indicate the presence of deeper anomalies. Forward modelling was used to determine the required transmitter moments for various grounded-wire and loop sources for a field test using the WD-1 well in the Kakkonda geothermal area. VEMP logging of the WD-1 well was carried out in May 1994 and the processed field data matches the computer simulations quite well.
Are rapid population estimates accurate? A field trial of two different assessment methods.
Grais, Rebecca F; Coulombier, Denis; Ampuero, Julia; Lucas, Marcelino E S; Barretto, Avertino T; Jacquier, Guy; Diaz, Francisco; Balandine, Serge; Mahoudeau, Claude; Brown, Vincent
2006-09-01
Emergencies resulting in large-scale displacement often lead to populations resettling in areas where basic health services and sanitation are unavailable. To plan relief-related activities quickly, rapid population size estimates are needed. The currently recommended Quadrat method estimates total population by extrapolating the average population size living in square blocks of known area to the total site surface. An alternative approach, the T-Square, provides a population estimate based on an analysis of the spatial distribution of housing units sampled throughout a site. We field tested both methods and validated the results against a census in Esturro Bairro, Beira, Mozambique. Compared to the census (population: 9,479), the T-Square yielded a better population estimate (9,523) than the Quadrat method (7,681; 95% confidence interval: 6,160-9,201), but was more difficult for field survey teams to implement. Although applicable only to similar sites, several general conclusions can be drawn for emergency planning.
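The Quadrat extrapolation described above is simple enough to sketch directly: average the population density over the sampled blocks, then scale by the total site surface. The block counts and areas below are invented for illustration.

```python
def quadrat_estimate(block_counts, block_area, site_area):
    """Quadrat method: extrapolate the mean population density of sampled
    blocks of known area to the total site surface."""
    mean_density = sum(block_counts) / (len(block_counts) * block_area)
    return mean_density * site_area

# Illustrative numbers: 5 sampled blocks of 400 m^2 each on a 60,000 m^2 site
counts = [38, 45, 29, 51, 42]          # people counted per block
estimate = quadrat_estimate(counts, 400.0, 60000.0)
```

The confidence interval reported in the study would come from the between-block variance of the counts; the T-Square method instead infers density from distances between sample points and their nearest housing units, which is why it is harder to implement in the field.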
Multi-scale assessment of human-induced changes to ...
Context: Land use change and forest degradation have myriad effects on tropical ecosystems. Yet their consequences for low-order streams remain very poorly understood, including in the world's largest freshwater basin, the Amazon. Objectives: Determine the degree to which physical and chemical characteristics of the instream habitat of low-order Amazonian streams change in response to past local- and catchment-level anthropogenic disturbances. Methods: To do so, we collected field instream habitat (i.e., physical habitat and water quality) and landscape data from 99 stream sites in two eastern Brazilian Amazon regions. We used random forest regression trees to assess the relative importance of different predictor variables in determining changes in instream habitat response variables. Adaptations of the USEPA's National Aquatic Resource Survey (NARS) designs, field methods, and approaches for assessing ecological condition have been applied in state and basin stream surveys throughout the U.S., and also in countries outside of the U.S. These applications not only provide valuable tests of the NARS approaches, but generate new understandings of natural and anthropogenic controls on biota and physical habitat in streams. Results from applications in Brazil, for example, not only aid interpretation of the condition of Brazilian streams, but also refine approaches for interpreting aquatic resource surveys in the U.S. and elsewhere. In this article, the authors des
SkICAT: A cataloging and analysis tool for wide field imaging surveys
NASA Technical Reports Server (NTRS)
Weir, N.; Fayyad, U. M.; Djorgovski, S. G.; Roden, J.
1992-01-01
We describe an integrated system, SkICAT (Sky Image Cataloging and Analysis Tool), for the automated reduction and analysis of the Palomar Observatory-ST ScI Digitized Sky Survey. The Survey will consist of the complete digitization of the photographic Second Palomar Observatory Sky Survey (POSS-II) in three bands, comprising nearly three Terabytes of pixel data. SkICAT applies a combination of existing packages, including FOCAS for basic image detection and measurement and SAS for database management, as well as custom software, to the task of managing this wealth of data. One of the most novel aspects of the system is its method of object classification. Using state-of-the-art machine learning classification techniques (GID3* and O-BTree), we have developed a powerful method for automatically distinguishing point sources from non-point sources and artifacts, achieving comparably accurate discrimination a full magnitude fainter than in previous Schmidt plate surveys. The learning algorithms produce decision trees for classification by examining instances of objects classified by eye on both plate and higher quality CCD data. The same techniques will be applied to perform higher-level object classification (e.g., of galaxy morphology) in the near future. Another key feature of the system is the facility to integrate the catalogs from multiple plates (and portions thereof) to construct a single catalog of uniform calibration and quality down to the faintest limits of the survey. SkICAT also provides a variety of data analysis and exploration tools for the scientific utilization of the resulting catalogs. We include initial results of applying this system to measure the counts and distribution of galaxies in two bands down to Bj ≈ 21 mag over an approximate 70 square degree multi-plate field from POSS-II. SkICAT is constructed in a modular and general fashion and should be readily adaptable to other large-scale imaging surveys.
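GID3* and O-BTree are specialized decision-tree learners; as a stand-in, the sketch below trains a single-split "decision stump", the threshold-on-feature test that forms one node of such a tree, to separate point sources from extended sources. The two image features (a FWHM-like size and an ellipticity) and their values are invented for illustration.

```python
def train_stump(X, y):
    """Exhaustively search for the (feature, threshold, polarity) split
    that minimizes training misclassifications; labels are 0/1."""
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            for pol in (1, -1):
                err = sum((1 if pol * (row[f] - t) > 0 else 0) != lab
                          for row, lab in zip(X, y))
                if err < best_err:
                    best_err, best = err, (f, t, pol)
    return best

def classify(stump, row):
    f, t, pol = stump
    return 1 if pol * (row[f] - t) > 0 else 0

# Made-up features per detection: (FWHM-like size, ellipticity);
# label 1 = extended source (galaxy), 0 = point source (star)
X = [(1.1, 0.05), (1.2, 0.10), (1.3, 0.08),   # point sources
     (2.4, 0.40), (3.1, 0.55), (2.8, 0.35)]   # extended sources
y = [0, 0, 0, 1, 1, 1]
stump = train_stump(X, y)
preds = [classify(stump, row) for row in X]
```

A full tree learner applies this split search recursively to each resulting subset, which is how the eye-classified training instances mentioned in the abstract become an automatic classifier.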
Adaption from LWIR to visible wavebands of methods to describe the population of GEO belt debris
NASA Astrophysics Data System (ADS)
Meng, Kevin; Murray-Krezan, Jeremy; Seitzer, Patrick
2018-05-01
Prior efforts to characterize the number of GEO belt debris objects by statistically analyzing the distribution of debris as a function of size have relied on techniques unique to infrared measurements of the debris. Specifically the infrared measurement techniques permitted inference of the characteristic size of the debris. This report describes a method to adapt the previous techniques and measurements to visible wavebands. Results will be presented using data from a NASA optical, visible band survey of objects near the geosynchronous orbit, GEO belt. This survey used the University of Michigan's 0.6-m Curtis-Schmidt telescope, Michigan Orbital DEbris Survey Telescope (MODEST), located at Cerro Tololo Inter-American Observatory in Chile. The system is equipped with a scanning CCD with a field of view of 1.6°×1.6°, and can detect objects smaller than 20 cm diameter at GEO.
Reconstructing the gravitational field of the local Universe
NASA Astrophysics Data System (ADS)
Desmond, Harry; Ferreira, Pedro G.; Lavaux, Guilhem; Jasche, Jens
2018-03-01
Tests of gravity at the galaxy scale are in their infancy. As a first step to systematically uncovering the gravitational significance of galaxies, we map three fundamental gravitational variables - the Newtonian potential, acceleration and curvature - over the galaxy environments of the local Universe to a distance of approximately 200 Mpc. Our method combines the contributions from galaxies in an all-sky redshift survey, haloes from an N-body simulation hosting low-luminosity objects, and linear and quasi-linear modes of the density field. We use the ranges of these variables to determine the extent to which galaxies expand the scope of generic tests of gravity and are capable of constraining specific classes of model for which they have special significance. Finally, we investigate the improvements afforded by upcoming galaxy surveys.
Efficiency of snake sampling methods in the Brazilian semiarid region.
Mesquita, Paula C M D; Passos, Daniel C; Cechin, Sonia Z
2013-09-01
The choice of sampling methods is a crucial step in every field survey in herpetology. In countries where time and financial support are limited, this choice is critical. The methods used to sample snakes often lack objective criteria, and tradition has apparently mattered more than suitability when making the choice. Consequently, studies using non-standardized methods are frequently found in the literature. We compared four commonly used methods for sampling snake assemblages in a semiarid area of Brazil, evaluating the cost-benefit of each in terms of the number of individuals and species captured, time, and financial investment. We found that pitfall traps were the least effective method in all aspects evaluated and were not complementary to the other methods in terms of species abundance and assemblage structure. We conclude that methods can only be considered complementary if they are standardized to the objectives of the study. The use of pitfall traps in short-term surveys of the snake fauna in areas with shrubby vegetation and stony soil is not recommended.
Through the looking glass: Applications of ground-penetrating radar in archaeology
NASA Astrophysics Data System (ADS)
Stamos, Antonia
The focus of this dissertation is to present the results of four years' worth of geophysical surveying at four major archaeological sites in Greece and the benefits to the archaeological community. The ground penetrating radar offers an inexpensive, non-destructive solution to the problem of deciding how much of a site is worth excavating and which areas would yield the most promising results. An introduction to the ground penetrating radar, or GPR, the equipment necessary to conduct a geophysical survey in the field, and the methods of data collection and subsequent data processing are all addressed. The benefits to the archaeological community are many, and future excavations will incorporate such an important tool for a greater understanding of the site. The history of GPR work in the archaeological field has grown at an astounding rate from its beginnings as a simple tool for petroleum and mining services in the beginning of the twentieth century. By mid-century, the GPR was first applied to archaeological sites rather than its common use by utility companies in locating pipes, cables, tunnels, and shafts. Although the preliminary surveys were little more than a search to locate buried walls, the success of these initial surveys paved the way for future surveys at other archaeological sites, many testing the radar's efficacy with a myriad of soil conditions and properties. The four sites in which geophysical surveys with a ground penetrating radar were conducted are Azorias on the island of Crete, Kolonna on the island of Aegina, Mochlos Island and Coastal Mochlos on the island of Crete, and Mycenae in the Peloponnese on mainland Greece. These case studies are first presented in terms of their geographical location, their mythology and etymology, where applicable, along with a brief history of excavation and occupation of the site. 
Additional survey methods were used at Mycenae, including aerial photography and ERDAS Imagine, a silo locating program now applied for "surface surveying." Each survey site is introduced via geographical location and proximity to known features, as well as with a comprehensive breakdown of the data into real-time depth, or depth-slices, for better identification of features.
Rodriguez-Falces, Javier
2013-12-01
In electrophysiology studies, it is becoming increasingly common to explain experimental observations using both descriptive methods and quantitative approaches. However, some electrophysiological phenomena, such as the generation of extracellular potentials resulting from propagation of the excitation source along a muscle fiber, are difficult to describe and conceptualize. In addition, most traditional approaches to describing extracellular potentials rely on complex mathematical machinery that leaves little room for physical interpretation. The aim of the present study is to present a new method for teaching the formation of extracellular potentials around a muscle fiber from both a descriptive and a quantitative perspective. The implementation of this method was tested through a written exam and a satisfaction survey. The new method enhanced the ability of students to visualize the generation of bioelectrical potentials. In addition, the new approach improved students' understanding of how changes in the fiber-to-electrode distance and in the shape of the excitation source translate into changes in the extracellular potential. The survey results show that combining general principles of electrical fields with accurate graphic imagery gives students an intuitive, yet quantitative, feel for electrophysiological signals and enhances their motivation to continue their studies in the biomedical engineering field.
We developed an index of relative bed stability (LRBS) based on low flow survey data collected using the U.S. Environmental Protection Agency’s Environmental Monitoring and Assessment Program (EMAP) field methods to assess anthropogenic sedimentation in streams. LRBS is the log ...
A View of Studies on Bibliometrics and Related Subjects in Japan.
ERIC Educational Resources Information Center
Miyamoto, Sadaaki; And Others
1989-01-01
Surveys studies on bibliometrics and related subjects in Japan, classifying them into studies on bibliometrics and applications of bibliometrics. Examines applications of fuzzy set theory to document retrieval using bibliometric techniques. Emphasizes the models and methods used in common between bibliometrics and other fields of science. (SR)
ABCs of Diversifying Information Resources among Rice Smallholders of Ghana
ERIC Educational Resources Information Center
Misiko, M.; Halm, E.
2016-01-01
Purpose: We investigated how information resource diversification can enhance smallholder agricultural knowledge in Ghana. Design/Methodology/Approach: Study tools and methods were questionnaire survey (N = 200), focus group discussion (N = 1), in-depth interviews (N = 18) and field direct observation. Findings: This study shows there existed…
Tethered acoustic doppler current profiler platforms for measuring streamflow
Rehmel, Michael S.; Stewart, James A.; Morlock, Scott E.
2003-01-01
A tethered-platform design with a trimaran hull and 900-megahertz radio modems is now commercially available. Continued field use has resulted in U.S. Geological Survey procedures for making tethered-platform discharge measurements, including methods for tethered-boat deployment, moving-bed tests, and measurement of edge distances.
Coping Styles of Failing Brunei Vocational Students
ERIC Educational Resources Information Center
Mundia, Lawrence; Salleh, Sallimah
2017-01-01
Purpose: The purpose of this paper is to determine the prevalence of two types of underachieving students (n = 246) (active failing (AF) and passive failing (PF)) in Brunei vocational and technical education (VTE) institutions and their patterns of coping. Design/methodology/approach: The field survey method was used to directly reach many…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-09
... cognitive interviews, focus groups, Pilot household interviews, and experimental research in laboratory and field settings, both for applied questionnaire evaluation and more basic research on response errors in surveys. The most common evaluation method is the cognitive interview, in which a questionnaire design...
Exemplary Rehabilitation Educators Experiences of Mentoring and Beginning Their Careers
ERIC Educational Resources Information Center
Marini, Irmo; Graf, Noreen M.; Reed, Bruce J.
2017-01-01
Purpose: To investigate the career experiences and mentoring advice of nationally recognized rehabilitation educators who have excelled and proffer strategies for success to newcomers to the field. Method: The authors surveyed via Qualtrics 28 rehabilitation educators regarding their career experiences with open and closed structured questions and…
Techniques for obtaining subjective response to vertical vibration
NASA Technical Reports Server (NTRS)
Clarke, M. J.; Oborne, D. J.
1975-01-01
Laboratory experiments were performed to validate the techniques used for obtaining ratings in the field surveys carried out by the University College of Swansea. In addition, attempts were made to evaluate the basic form of the human response to vibration. Some of the results obtained by different methods are described.
Triangulation-based 3D surveying borescope
NASA Astrophysics Data System (ADS)
Pulwer, S.; Steglich, P.; Villringer, C.; Bauer, J.; Burger, M.; Franz, M.; Grieshober, K.; Wirth, F.; Blondeau, J.; Rautenberg, J.; Mouti, S.; Schrader, S.
2016-04-01
In this work, a measurement concept based on triangulation was developed for borescopic 3D surveying of surface defects. The integration of such a measurement system into a borescope environment requires excellent space utilization. The triangulation angle, the projected pattern, the numerical apertures of the optical system, and the viewing angle were calculated using partial coherence imaging and geometric optical ray-tracing methods. Additionally, optical aberrations and defocus were considered by the integration of Zernike polynomial coefficients. The measurement system is able to measure objects with a size of 50 μm in all dimensions with an accuracy of ±5 μm. To manage the issue of a low depth of field while using a high-resolution optical system, a wavelength-dependent aperture was integrated. Thereby, we are able to control the depth of field and resolution of the optical system and can use the borescope in measurement mode, with high resolution and low depth of field, or in inspection mode, with low resolution and higher depth of field. First measurements of a demonstrator system are in good agreement with our simulations.
Evaluation of body weight of sea cucumber Apostichopus japonicus by computer vision
NASA Astrophysics Data System (ADS)
Liu, Hui; Xu, Qiang; Liu, Shilin; Zhang, Libin; Yang, Hongsheng
2015-01-01
Apostichopus japonicus (Holothuroidea, Echinodermata) is an ecologically and economically important species in East Asia. Conventional biometric monitoring methods include diving for samples and weighing above water, with high variability in weight measurements due to variation in the quantity of water in the respiratory tree and the intestinal content of this species. Recently, video survey methods have been applied widely in biometric detection of underwater benthos. However, because of the high flexibility of the A. japonicus body, video survey methods are rarely used to monitor sea cucumbers. In this study, we designed a model to evaluate the wet weight of A. japonicus using machine vision technology combined with a support vector machine (SVM), which can be used in field surveys of the A. japonicus population. Continuous dorsal images of free-moving A. japonicus individuals in seawater were captured, which also allows for the development of core body edge images as well as thorn segmentation. Parameters including body length, body breadth, perimeter, and area were extracted from the core body edge images and used in SVM regression to predict the weight of A. japonicus and for comparison with a power model. Results indicate that the use of SVM for predicting the weight of 33 A. japonicus individuals is accurate (R² = 0.99) and compatible with the power model (R² = 0.96). The image-based analysis and size-weight regression models in this study may be useful in body weight evaluation of A. japonicus in lab and field studies.
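The power model against which the SVM is compared has the form W = a·L^b, which can be fit by ordinary least squares in log-log space. The sketch below illustrates that step with hypothetical morphometric data; the lengths, weights, and resulting coefficients are illustrative only, not values from the study.

```python
import numpy as np

# Hypothetical body lengths (cm) and wet weights (g); illustrative only.
length = np.array([6.0, 7.5, 9.0, 10.5, 12.0, 13.5])
weight = np.array([25.0, 48.0, 80.0, 125.0, 185.0, 260.0])

# Power model W = a * L^b is linear in log space:
# log W = log a + b * log L, so fit with ordinary least squares.
b, log_a = np.polyfit(np.log(length), np.log(weight), 1)
a = np.exp(log_a)

# Coefficient of determination of the fitted power model.
predicted = a * length ** b
r2 = 1 - np.sum((weight - predicted) ** 2) / np.sum((weight - weight.mean()) ** 2)
print(f"a={a:.3f}, b={b:.3f}, R^2={r2:.3f}")
```

Fitting in log space assumes multiplicative errors; a nonlinear fit of W = a·L^b directly would weight large individuals more heavily.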
The Exoplanet Microlensing Survey by the Proposed WFIRST Observatory
NASA Technical Reports Server (NTRS)
Barry, Richard; Kruk, Jeffrey; Anderson, Jay; Beaulieu, Jean-Philippe; Bennett, David P.; Catanzarite, Joseph; Cheng, Ed; Gaudi, Scott; Gehrels, Neil; Kane, Stephen;
2012-01-01
The New Worlds, New Horizons report released by the Astronomy and Astrophysics Decadal Survey Board in 2010 listed the Wide Field Infrared Survey Telescope (WFIRST) as the highest-priority large space mission for the coming decade. This observatory will provide wide-field imaging and slitless spectroscopy at near-infrared wavelengths. The scientific goals are to obtain a statistical census of exoplanets using gravitational microlensing, measure the expansion history of and the growth of structure in the Universe by multiple methods, and perform other astronomical surveys to be selected through a guest observer program. A Science Definition Team has been established to assist NASA in the development of a Design Reference Mission that accomplishes this diverse array of science programs with a single observatory. In this paper we present the current WFIRST payload concept and the expected capabilities for planet detection. The observatory, with science goals that are complementary to the Kepler exoplanet transit mission, is designed to complete the statistical census of planetary systems in the Galaxy, from habitable Earth-mass planets to free-floating planets, including analogs to all of the planets in our Solar System except Mercury. The exoplanet microlensing survey will observe for 500 days spanning 5 years. This long temporal baseline will enable the determination of the masses of most detected exoplanets down to 0.1 Earth masses.
Probing Primordial Non-Gaussianity with Weak-lensing Minkowski Functionals
NASA Astrophysics Data System (ADS)
Shirasaki, Masato; Yoshida, Naoki; Hamana, Takashi; Nishimichi, Takahiro
2012-11-01
We study the cosmological information contained in the Minkowski functionals (MFs) of weak gravitational lensing convergence maps. We show that the MFs provide strong constraints on the local-type primordial non-Gaussianity parameter f_NL. We run a set of cosmological N-body simulations and perform ray-tracing simulations of weak lensing to generate 100 independent convergence maps of a 25 deg² field of view for f_NL = -100, 0 and 100. We perform a Fisher analysis to study the degeneracy among other cosmological parameters such as the dark energy equation of state parameter w and the fluctuation amplitude σ_8. We use fully nonlinear covariance matrices evaluated from 1000 ray-tracing simulations. For upcoming wide-field observations such as those from the Subaru Hyper Suprime-Cam survey with a proposed survey area of 1500 deg², the primordial non-Gaussianity can be constrained at a level of f_NL ~ 80 and w ~ 0.036 by weak-lensing MFs. If simply scaled by the effective survey area, a 20,000 deg² lensing survey using the Large Synoptic Survey Telescope will yield constraints of f_NL ~ 25 and w ~ 0.013. We show that these constraints can be further improved by a tomographic method using source galaxies in multiple redshift bins.
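A Fisher analysis of the kind described reduces, in its simplest form, to F_ij = sum_ab (dmu_a/dtheta_i) (C^-1)_ab (dmu_b/dtheta_j), with marginalized 1-sigma errors sqrt((F^-1)_ii), and errors scaling as the inverse square root of survey area. The toy numerical version below uses made-up derivatives and a made-up diagonal covariance, not the paper's measured quantities; only the mechanics are faithful.

```python
import numpy as np

# Toy Fisher forecast for two parameters (f_NL, w); the derivative
# vectors and covariance below are illustrative placeholders.
n_bins = 5
dmu_dfnl = np.linspace(0.5e-3, 1.5e-3, n_bins)  # d(observable)/d f_NL
dmu_dw = np.linspace(0.8, 1.6, n_bins)          # d(observable)/d w
cov = np.diag(np.full(n_bins, 4e-3))            # data covariance matrix

derivs = np.vstack([dmu_dfnl, dmu_dw])          # shape (2, n_bins)
cinv = np.linalg.inv(cov)
fisher = derivs @ cinv @ derivs.T               # F_ij

# Marginalized 1-sigma errors: sqrt of the diagonal of F^-1.
errors = np.sqrt(np.diag(np.linalg.inv(fisher)))

# Simple area scaling, 1500 deg^2 -> 20,000 deg^2: errors / sqrt(ratio).
scaled = errors * np.sqrt(1500.0 / 20000.0)
print(errors, scaled)
```

The area scaling is the same "simply scaled by the effective survey area" argument used in the abstract to go from Hyper Suprime-Cam to LSST-like constraints.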
McGarvey, Daniel J.; Falke, Jeffrey A.; Li, Hiram W.; Li, Judith; Hauer, F. Richard; Lamberti, G.A.
2017-01-01
Methods to sample fishes in stream ecosystems and to analyze the raw data, focusing primarily on assemblage-level (all fish species combined) analyses, are presented in this chapter. We begin with guidance on sample site selection, permitting for fish collection, and information-gathering steps to be completed prior to conducting fieldwork. Basic sampling methods (visual surveying, electrofishing, and seining) are presented with specific instructions for estimating population sizes via visual, capture-recapture, and depletion surveys, in addition to new guidance on environmental DNA (eDNA) methods. Steps to process fish specimens in the field including the use of anesthesia and preservation of whole specimens or tissue samples (for genetic or stable isotope analysis) are also presented. Data analysis methods include characterization of size-structure within populations, estimation of species richness and diversity, and application of fish functional traits. We conclude with three advanced topics in assemblage-level analysis: multidimensional scaling (MDS), ecological networks, and loop analysis.
Neville, Stephen; Adams, Jeffery; Cook, Catherine
2016-12-01
Undertaking qualitative research with vulnerable populations is a complex and challenging process for researchers. Traditional and common modes of collecting qualitative data with these groups have been via face-to-face recorded interviews. This article reports on three internet-based data collection methods: email and synchronous online interviews, as well as an online qualitative survey. The key characteristics of using email, synchronous online interviews and an online qualitative survey, including the strengths and limitations of each, are presented. Reflections and insights on the use of these internet-based data collection methods are provided to encourage researchers to embrace technology and move away from using traditional face-to-face interviews when researching with vulnerable populations. Using the internet to collect qualitative data offers additional ways to gather qualitative data over traditional data collection methods. The use of alternative interview methods may encourage participation of vulnerable participants.
A Multi-resolution, Multi-epoch Low Radio Frequency Survey of the Kepler K2 Mission Campaign 1 Field
NASA Astrophysics Data System (ADS)
Tingay, S. J.; Hancock, P. J.; Wayth, R. B.; Intema, H.; Jagannathan, P.; Mooley, K.
2016-10-01
We present the first dedicated radio continuum survey of a Kepler K2 mission field, Field 1, covering the North Galactic Cap. The survey is wide field, contemporaneous, multi-epoch, and multi-resolution in nature and was conducted at low radio frequencies between 140 and 200 MHz. The multi-epoch and ultra wide field (but relatively low resolution) part of the survey was provided by 15 nights of observation using the Murchison Widefield Array (MWA) over a period of approximately a month, contemporaneous with K2 observations of the field. The multi-resolution aspect of the survey was provided by the low-resolution (4′) MWA imaging, complemented by non-contemporaneous but much higher resolution (20″) observations using the Giant Metrewave Radio Telescope (GMRT). The survey is, therefore, sensitive to the details of radio structures across a wide range of angular scales. Consistent with other recent low radio frequency surveys, no significant radio transients or variables were detected in the survey. The resulting source catalogs consist of 1085 and 1468 detections in the two MWA observation bands (centered at 154 and 185 MHz, respectively) and 7445 detections in the GMRT observation band (centered at 148 MHz), over 314 square degrees. The survey is presented as a significant resource for multi-wavelength investigations of the more than 21,000 target objects in the K2 field. We briefly examine our survey data against K2 target lists for dwarf star types (stellar types M and L) that have been known to produce radio flares.
Development of a Research Participants’ Perception Survey to Improve Clinical Research
Yessis, Jennifer L.; Kost, Rhonda G.; Lee, Laura M.; Coller, Barry S.; Henderson, David K.
2012-01-01
Abstract Introduction: Clinical research participants’ perceptions regarding their experiences during research protocols provide outcome‐based insights into the effectiveness of efforts to protect rights and safety, and opportunities to enhance participants’ clinical research experiences. Use of validated surveys measuring patient‐centered outcomes is standard in hospitals, yet no instruments exist to assess outcomes of clinical research processes. Methods: We derived survey questions from data obtained from focus groups comprised of research participants and professionals. We assessed the survey for face/content validity, and privacy/confidentiality protections and fielded it to research participants at 15 centers. We conducted analyses of response rates, sample characteristics, and psychometrics, including survey and item completion and analysis, internal consistency, item internal consistency, criterion‐related validity, and item usefulness. Responses were tested for fit into existing patient‐centered dimensions of care and new clinical research dimensions using Cronbach's alpha coefficient. Results: Surveys were mailed to 18,890 individuals; 4,961 were returned (29%). Survey completion was 89% overall; completion rates exceeded 90% for 88 of 93 evaluable items. Questions fit into three dimensions of patient‐centered care and two novel clinical research dimensions (Cronbach's alpha for dimensions: 0.69–0.85). Conclusions: The validated survey offers a new method for assessing and improving outcomes of clinical research processes. Clin Trans Sci 2012; Volume 5: 452–460 PMID:23253666
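The internal-consistency statistic cited above, Cronbach's alpha, is alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale). A minimal sketch with hypothetical Likert responses follows; the score matrix is invented for illustration and is not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-5 Likert responses: 6 respondents x 4 items.
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 3, 4],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 5, 4],
    [3, 2, 2, 3],
])
alpha = cronbach_alpha(scores)
print(round(alpha, 3))
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is the sense in which the survey's dimension alphas of 0.69-0.85 are reported.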
Sines and Cosines. Part 2 of 3
NASA Technical Reports Server (NTRS)
Apostol, Tom M. (Editor)
1993-01-01
The Law of Sines and the Law of Cosines are introduced and demonstrated in this 'Project Mathematics' series video using both film footage and computer animation. This video deals primarily with the mathematical field of Trigonometry and explains how these laws were developed and their applications. One significant use is geographical and geological surveying. This includes both the triangulation method and the spirit leveling method. With these methods, it is shown how the height of the tallest mountain in the world, Mt. Everest, was determined.
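The triangulation method described in the video can be sketched numerically: measuring the angle of elevation of a peak from two stations along a common baseline determines its height. The sketch below uses one classic two-observation scheme (written with tangents rather than an explicit law-of-sines step); the baseline and angles are illustrative numbers, not the historical Everest survey data.

```python
import math

# Two stations on a line toward the peak, at the same elevation,
# separated by a known baseline b (illustrative values).
b = 2000.0                   # baseline between stations, metres
theta1 = math.radians(30.0)  # elevation angle at the far station
theta2 = math.radians(45.0)  # elevation angle at the near station

# h = x * tan(theta2) and h = (x + b) * tan(theta1) solved together:
x = b * math.tan(theta1) / (math.tan(theta2) - math.tan(theta1))
h = x * math.tan(theta2)
print(f"height above station level = {h:.1f} m")
```

In practice the surveyed height is referenced to sea level via spirit leveling, the second method the video mentions, which carries elevation inland from the coast in small measured increments.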
Submillimeter Galaxy Surveys with AzTEC
NASA Astrophysics Data System (ADS)
Wilson, Grant W.; Ade, P. A.; Aretxaga, I.; Austermann, J.; Battersby, C.; Bock, J. J.; Glenn, J.; Golwala, S. R.; Haig, D.; Hughes, D. H.; Kang, Y.; Kim, S.; Lowenthal, J.; Mauskopf, P. D.; Perera, T.; Scott, K.; Roberts, C.; Yoon, I.; Yun, M. S.
2006-06-01
We describe a recent large scale survey of the Submillimeter Galaxy (SMG) population by AzTEC, a 144 element bolometer camera, on the 15m diameter James Clerk Maxwell Telescope. From November 2005 to February 2006, over 400 hours of telescope time were spent imaging over 1 square degree of sky with an area weighted target sensitivity of 0.7 mJy rms. Several fields with large multi-wavelength data sets were mapped including the Subaru/XMM-Newton Deep Survey field, the Lockman Hole, GOODS-N, and a subset of the COSMOS field. In addition we mapped fields spanning a wide range of environments including several regions with known mass over-density. Together this represents the largest and deepest survey of the SMG population to date. Herein we report on the technical details of the surveys, describe the reduction pipeline, and show preliminary results from a subsection of the survey fields.
Phillips, Andrew W; Friedman, Benjamin T; Utrankar, Amol; Ta, Andrew Q; Reddy, Shalini T; Durning, Steven J
2017-02-01
To establish a baseline overall response rate for surveys of health professions trainees, determine strategies associated with improved response rates, and evaluate for the presence of nonresponse bias. The authors performed a comprehensive analysis of all articles published in Academic Medicine, Medical Education, and Advances in Health Sciences Education in 2013, recording response rates. Additionally, they reviewed nonresponse bias analyses and factors suggested in other fields to affect response rate including survey delivery method, prenotification, and incentives. The search yielded 732 total articles; of these, 356 were research articles, and of these, 185 (52.0%) used at least one survey. Of these, 66 articles (35.6%) met inclusion criteria and yielded 73 unique surveys. Of the 73 surveys used, investigators reported a response rate for 63.0% of them; response rates ranged from 26.6% to 100%, mean (standard deviation) 71.3% (19.5%). Investigators reported using incentives for only 16.4% of the 73 surveys. The only survey methodology factor significantly associated with response rate was single- vs. multi-institutional surveys (respectively, 74.6% [21.2%] vs. 62.0% [12.8%], P = .022). Notably, statistical power for all analyses was limited. No articles evaluated for nonresponse bias. Approximately half of the articles evaluated used a survey as part of their methods. Limited data are available to establish a baseline response rate among health professions trainees and inform researchers which strategies are associated with higher response rates. Journals publishing survey-based health professions education research should improve reporting of response rate, nonresponse bias, and other survey factors.
Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne
2018-01-01
Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. 
Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile phones offers promise for future data collection in Ghana and may be suitable for other developing countries. PMID:29351349
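The response, cooperation, refusal, and contact rates reported for the Ghana survey follow the AAPOR standard definitions. A simplified sketch of those rate formulas is below; the call-outcome counts are hypothetical, since the abstract does not give the raw tallies, and the response-rate formula shown is the simplified RR2-style variant without estimated-eligibility terms.

```python
# Hypothetical call-outcome counts among eligible numbers.
complete = 900      # I: completed interviews
partial = 100       # P: partial interviews
refusal = 250       # R: refusals and break-offs
noncontact = 1750   # NC: eligible numbers never reached

eligible = complete + partial + refusal + noncontact

# Simplified AAPOR-style rates.
response_rate = (complete + partial) / eligible
cooperation_rate = (complete + partial) / (complete + partial + refusal)
refusal_rate = refusal / eligible
contact_rate = (complete + partial + refusal) / eligible

print(response_rate, cooperation_rate, refusal_rate, contact_rate)
```

With these invented counts the pattern resembles the study's: a modest response rate driven by noncontact, paired with a high cooperation rate among those actually reached.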
NASA Astrophysics Data System (ADS)
Denli, H.; Huang, L.
2008-12-01
Quantitative monitoring of reservoir property changes is essential for safe geologic carbon sequestration. Time-lapse seismic surveys have the potential to effectively monitor fluid migration in the reservoir that causes geophysical property changes such as density and P- and S-wave velocities. We introduce a novel method for quantitative estimation of seismic velocity changes using time-lapse seismic data. The method employs elastic sensitivity wavefields, which are the derivatives of the elastic wavefield with respect to the density and the P- and S-wave velocities of a target region. We derive the elastic sensitivity equations from analytical differentiation of the elastic-wave equations with respect to seismic-wave velocities. The sensitivity equations are coupled with the wave equations in such a way that elastic waves arriving in a target reservoir act as a secondary source for the sensitivity fields. We use a staggered-grid finite-difference scheme with perfectly matched layer absorbing boundary conditions to simultaneously solve the elastic-wave equations and the elastic sensitivity equations. From the elastic-wave sensitivities, a linear relationship between relative seismic velocity changes in the reservoir and time-lapse seismic data at receiver locations can be derived, which leads to an over-determined system of equations. We solve this system of equations using a least-squares method for each receiver to obtain P- and S-wave velocity changes. We validate the method using both surface and VSP synthetic time-lapse seismic data for a multi-layered model and the elastic Marmousi model. Then we apply it to the time-lapse field VSP data acquired at the Aneth oil field in Utah. A total of 10.5K tons of CO2 was injected into the oil reservoir between the two VSP surveys for enhanced oil recovery. The synthetic and field data studies show that our new method can quantitatively estimate changes in seismic velocities within a reservoir due to CO2 injection and migration.
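The final estimation step described above, solving an over-determined linear system relating velocity changes to time-lapse data by least squares, can be sketched generically. The sensitivity matrix and data below are synthetic stand-ins, not actual elastic sensitivities or field data; only the least-squares machinery is the point.

```python
import numpy as np

# Toy over-determined system: time-lapse data differences d at 50
# receiver samples are modeled as G @ m, where m holds the two
# unknown relative velocity changes (dVp/Vp, dVs/Vs).
rng = np.random.default_rng(0)
G = rng.normal(size=(50, 2))              # synthetic sensitivity matrix
m_true = np.array([0.03, -0.015])         # +3% P-wave, -1.5% S-wave change
d = G @ m_true + 1e-4 * rng.normal(size=50)  # data with small noise

# Least-squares solution of G m = d.
m_est, *_ = np.linalg.lstsq(G, d, rcond=None)
print(m_est)
```

With many more equations than unknowns, the small additive noise averages out and the recovered changes sit very close to the true values, which is the property that makes the over-determined formulation attractive for quantitative monitoring.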
Middleton, A; Bragin, E; Morley, K I; Parker, M
2014-03-01
How can a researcher engage a participant in a survey, when the subject matter may be perceived as 'challenging' or even be totally unfamiliar to the participant? The Genomethics study addressed this via the creation and delivery of a novel online questionnaire containing 10 integrated films. The films documented various ethical dilemmas raised by genomic technologies and the survey ascertained attitudes towards these. Participants were recruited into the research using social media, traditional media and email invitation. The film-survey strategy was successful: 11,336 initial hits on the survey website led to 6944 completed surveys. Participants included from those who knew nothing of the subject matter through to experts in the field of genomics (61% compliance rate), 72% of participants answered every single question. This paper summarises the survey design process and validation methods applied. The recruitment strategy and results from the survey are presented elsewhere. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
DuVal, C.; Trembanis, A. C.; Miller, J. K.; Carton, G.
2016-12-01
Munitions and Explosives of Concern (MEC) have been acknowledged globally as a topic of concern. Increasing use of coastal and continental shelf environments for renewable energy development and other activities has placed, and continues to place, humans in contact with legacy military munitions. The Bureau of Ocean Energy Management (BOEM) recognized the need to develop guidance concerning methods for MEC detection in the case of offshore energy development. The study was designed to identify the MEC most likely to be encountered in the Atlantic Outer Continental Shelf (OCS) Wind Energy Areas (WEA), review available technologies, and develop a process for selecting appropriate technologies and methodologies for their detection. The process for selecting and optimizing technologies and methods for detection of MEC in BOEM OCS WEAs was developed and tested through the synthesis of historical research, physical site characterization, remote sensing technology review, and in-field trials. To test the selected approach, designated personnel were tasked with seeding a portion of the Delaware WEA with munitions surrogates, while a second group of researchers, not privy to the surrogate locations, tested and optimized the selected methodology. The effectiveness of a methodology is related to ease of detection and other associated parameters. The approach for the in-field trial consists of a combination of wide-area assessment surveying by a vessel-mounted 230/550 kHz Edgetech 6205 phase-measuring sonar and near-seafloor surveying using a Teledyne Gavia autonomous underwater vehicle (AUV) equipped with a high-resolution 900/1800 kHz Marine Sonics side-scan sonar, a Geometrics G880-AUV cesium-vapor magnetometer, and a 2-megapixel Point Grey color camera. Survey parameters (e.g. track-line spacing, coverage overlap, AUV altitude) were varied to determine the optimal survey methods, as well as to simulate MEC burial to test magnetometer range performance. 
Preliminary results indicate the combination of high-resolution, near-bed side-scan sonar and magnetometry yields promising results for MEC identification, addressing the potential for both surficial and buried MEC.
An Efficient Acoustic Density Estimation Method with Human Detectors Applied to Gibbons in Cambodia.
Kidney, Darren; Rawson, Benjamin M; Borchers, David L; Stevenson, Ben C; Marques, Tiago A; Thomas, Len
2016-01-01
Some animal species are hard to see but easy to hear. Standard visual methods for estimating population density for such species are often ineffective or inefficient, but methods based on passive acoustics show more promise. We develop spatially explicit capture-recapture (SECR) methods for territorial vocalising species, in which humans act as an acoustic detector array. We use SECR and estimated bearing data from a single-occasion acoustic survey of a gibbon population in northeastern Cambodia to estimate the density of calling groups. The properties of the estimator are assessed using a simulation study, in which a variety of survey designs are also investigated. We then present a new form of the SECR likelihood for multi-occasion data which accounts for the stochastic availability of animals. In the context of gibbon surveys this allows model-based estimation of the proportion of groups that produce territorial vocalisations on a given day, thereby enabling the density of groups, instead of the density of calling groups, to be estimated. We illustrate the performance of this new estimator by simulation. We show that it is possible to estimate density reliably from human acoustic detections of visually cryptic species using SECR methods. For gibbon surveys we also show that incorporating observers' estimates of bearings to detected groups substantially improves estimator performance. Using the new form of the SECR likelihood we demonstrate that estimates of availability, in addition to population density and detection function parameters, can be obtained from multi-occasion data, and that the detection function parameters are not confounded with the availability parameter. This acoustic SECR method provides a means of obtaining reliable density estimates for territorial vocalising species. It is also efficient in terms of data requirements since it only requires routine survey data. 
We anticipate that the low-tech field requirements will make this method an attractive option in many situations where populations can be surveyed acoustically by humans.
NASA Astrophysics Data System (ADS)
Bernard, J.
2012-12-01
Manufacturers of geophysical instruments have faced, over the past decades, the fast evolution of electronics and computer science. More automation has been introduced into the equipment and into the processing and interpretation software, which may suggest that conducting geophysical surveys requires less understanding of the method and less experience than in the past. Hence there are misunderstandings about the skills needed to integrate geophysical results into the global body of information that the applied geologist must acquire to be successful in his applications. Globally, the demand in geophysical investigation goes towards greater penetration depth, requiring more powerful transmitters, and towards better resolution, requiring more data, as in 3D analysis. Budget aspects strongly favor high efficiency in the field combined with high-speed data processing. Innovation is required in all aspects of geophysics to meet market needs, including new technological (instruments, software) and methodological (methods, procedures, arrays) developments. The structures in charge of the geophysical work can be public organisations (institutes, ministries, geological surveys, …) or can come from the private sector (large companies, sub-contractors, consultants, …), each with its own constraints in the field work and in the processing and interpretation phases. 
In applications concerning groundwater investigations, mining exploration, and environmental and engineering surveys, examples of data and their interpretation presently carried out all around the world will be presented for DC resistivity (vertical electrical sounding; 2D and 3D resistivity imaging; resistivity monitoring), induced polarisation (time-domain 2D and 3D arrays for mining and environmental applications), magnetic resonance sounding (direct detection and characterisation of groundwater) and electromagnetics (multi-component and multi-spacing frequency-domain sounding and profiling techniques). The place that geophysics takes in the market among the other investigation techniques is, and will remain, dependent on the quality of the results obtained, despite the uncertainties linked to the field (noise aspects) and to the interpretation (equivalence aspects), under the control of budget decisions.
Deep learning for galaxy surface brightness profile fitting
NASA Astrophysics Data System (ADS)
Tuccillo, D.; Huertas-Company, M.; Decencière, E.; Velasco-Forero, S.; Domínguez Sánchez, H.; Dimauro, P.
2018-03-01
Numerous ongoing and future large area surveys (e.g. Dark Energy Survey, EUCLID, Large Synoptic Survey Telescope, Wide Field Infrared Survey Telescope) will increase by several orders of magnitude the volume of data that can be exploited for galaxy morphology studies. The full potential of these surveys can be unlocked only with the development of automated, fast, and reliable analysis methods. In this paper, we present DeepLeGATo, a new method for 2-D photometric galaxy profile modelling, based on convolutional neural networks. Our code is trained and validated on analytic profiles (HST/CANDELS F160W filter) and is able to retrieve the full set of parameters of one-component Sérsic models: total magnitude, effective radius, Sérsic index, and axis ratio. We show detailed comparisons between our code and GALFIT. On simulated data, our method is more accurate than GALFIT and ˜3000 times faster on GPU (˜50 times faster when running on the same CPU). On real data, DeepLeGATo trained on simulations behaves similarly to GALFIT on isolated galaxies. With a fast domain adaptation step made with only 0.1-0.8 per cent of the size of the training set, our code can easily reproduce the results obtained with GALFIT even in crowded regions. DeepLeGATo does not require any human intervention beyond the training step, rendering it much more automated than traditional profiling methods. The development of this method for more complex models (two-component galaxies, variable point spread function, dense sky regions) could constitute a fundamental tool in the era of big data in astronomy.
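The one-component Sérsic model whose parameters DeepLeGATo recovers has a compact closed form. The sketch below is illustrative only (function and parameter names are mine, not the authors' code), using a common truncated expansion for the b_n coefficient:

```python
import math

def sersic_bn(n):
    # Truncated series approximation to the Sersic b_n coefficient;
    # adequate for illustration, not for precision photometry.
    return 2.0 * n - 1.0 / 3.0 + 4.0 / (405.0 * n)

def sersic_profile(r, I_e, r_e, n):
    """Surface brightness of a one-component Sersic model at radius r.

    I_e: intensity at the effective radius r_e; n: Sersic index.
    """
    b = sersic_bn(n)
    return I_e * math.exp(-b * ((r / r_e) ** (1.0 / n) - 1.0))

def elliptical_radius(x, y, q):
    # Axis ratio q < 1 maps pixel offsets (x, y) to an effective radius.
    return math.hypot(x, y / q)
```

By construction the profile equals I_e at r = r_e for any index n, which is a convenient sanity check for any fitted parameter set.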
Liu, Jin-xinp; Lu, Heng; Zeng, Yan; Yue, Jian-wei; Meng, Fan-yun; Zhang, Yi-guang
2012-09-01
Resource surveys of traditional Chinese medicine and reserve estimation are among the most important issues for the protection and utilization of traditional Chinese medicine resources. This paper used multi-spatial-resolution remote sensing (RS) images, geographic information systems (GIS) and the global positioning system (GPS) to establish a 3S data platform for the survey of Scutellaria resources. Combined with traditional field survey methods, reserve estimation models were established for different small-scale habitat types of skullcap, which can estimate the reserves of wild Scutellaria in the Beijing-Tianjin-Hebei region and improve estimation accuracy. This can provide an important parameter for the fourth national survey of traditional Chinese medicine resources and for reserve estimates based on 3S technology using multiple-spatial-scale models.
[Key content and formulation of national Chinese materia medica resources survey at county level].
Lu, Jian-Wei; Zhang, Xiao-Bo; Li, Hai-Tao; Guo, Lan-Ping; Zhao, Run-Huai; Zhang, Ben-Gang; Sun, Li-Ying; Huang, Lu-Qi
2013-08-01
According to the National Census for Water, the National Population Census, the National Land and Resources Survey, and work experience from experimental measures for the national Chinese materia medica resources (CMMR) survey, the county-level survey is the key point of the whole effort and includes three key links: organization and management, field survey, and data sorting. Organization and management work for the national CMMR survey requires four key contents: definite goals and tasks, a practicable crew, a preparation directory, and security assurance. Field survey work requires five key contents: preparation for the field survey, choice of the key survey areas (samples), filling in questionnaires, video data collection, and collection of specimens and other physical material. Data sorting work requires three key contents: data, specimens, and census results.
Avasthi, Ajit; Basu, Debasish; Subodh, B. N.; Gupta, Pramod K.; Malhotra, Nidhi; Rani, Poonam; Sharma, Sunil
2017-01-01
Background: Substance misuse is a matter of major public health concern in India. House-to-house survey, though an appealing method to generate population-level estimates, has limitations for estimating prevalence rates of use of illicit and rare substances. Materials and Methods: In this rapid assessment survey (RAS), respondent-driven sampling was used to recruit substance-using individuals from the field. Size of the substance-using population was estimated using the “benchmark-multiplier” method. This figure was then projected to the entire population of the Union Territory (U.T) of Chandigarh. Focused group discussions were used to study the perceptions and views of the substance users regarding various aspects of substance use. Results: Prevalence of any substance dependence in the U.T of Chandigarh was estimated to be 4.65%. Dependence rates on opioids, cannabinoids, and sedative hypnotics were found to be 1.53%, 0.52%, and 0.015%, respectively. Prevalence of injectable opioids was calculated to be 0.91%. Injectable buprenorphine was the most commonly used opioid, followed by bhukhi/doda/opium and heroin. A huge gap was found between the prevalence rates of substance-using population and those seeking treatment. Conclusion: RAS can be a useful method to determine the prevalence of illicit and rare substances. Our survey shows that the use of substance including that of opioids is highly prevalent in the U.T of Chandigarh. The findings of this survey can have implications for policymaking. PMID:29085086
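The "benchmark-multiplier" estimate referred to above reduces to a single division: the count of users captured in a known data source (the benchmark) scaled up by the estimated fraction of all users that source captures (the multiplier). A minimal sketch with illustrative numbers, not the survey's actual inputs:

```python
def benchmark_multiplier_estimate(benchmark_count, multiplier):
    """Estimated size of a hidden population: benchmark / multiplier.

    benchmark_count: users appearing in a known source (e.g. treatment records).
    multiplier: estimated fraction of all users captured by that source.
    """
    if not 0.0 < multiplier <= 1.0:
        raise ValueError("multiplier must lie in (0, 1]")
    return benchmark_count / multiplier

def prevalence_percent(population_size, estimated_users):
    # Project the population estimate to a prevalence rate.
    return 100.0 * estimated_users / population_size
```

For example, if 500 users appear in treatment records and roughly 10% of users are estimated to seek treatment, the implied population is 5,000; dividing by the territory's population gives the prevalence rate.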
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brodie, Jean P.; Romanowsky, Aaron J.; Jennings, Zachary G.
2014-11-20
We introduce and provide the scientific motivation for a wide-field photometric and spectroscopic chemodynamical survey of nearby early-type galaxies (ETGs) and their globular cluster (GC) systems. The SAGES Legacy Unifying Globulars and GalaxieS (SLUGGS) survey is being carried out primarily with Subaru/Suprime-Cam and Keck/DEIMOS. The former provides deep gri imaging over a 900 arcmin² field-of-view to characterize GC and host galaxy colors and spatial distributions, and to identify spectroscopic targets. The NIR Ca II triplet provides GC line-of-sight velocities and metallicities out to typically ∼8 R_e, and to ∼15 R_e in some cases. New techniques to extract integrated stellar kinematics and metallicities to large radii (∼2-3 R_e) are used in concert with GC data to create two-dimensional (2D) velocity and metallicity maps for comparison with simulations of galaxy formation. The advantages of SLUGGS compared with other, complementary, 2D-chemodynamical surveys are its superior velocity resolution, radial extent, and multiple halo tracers. We describe the sample of 25 nearby ETGs, the selection criteria for galaxies and GCs, the observing strategies, the data reduction techniques, and modeling methods. The survey observations are nearly complete and more than 30 papers have so far been published using SLUGGS data. Here we summarize some initial results, including signatures of two-phase galaxy assembly, evidence for GC metallicity bimodality, and a novel framework for the formation of extended star clusters and ultracompact dwarfs. An integrated overview of current chemodynamical constraints on GC systems points to separate, in situ formation modes at high redshifts for metal-poor and metal-rich GCs.
Laser Scanning in Engineering Surveying: Methods of Measurement and Modeling of Structures
NASA Astrophysics Data System (ADS)
Lenda, Grzegorz; Uznański, Andrzej; Strach, Michał; Lewińska, Paulina
2016-06-01
The study is devoted to the uses of laser scanning in the field of engineering surveying. It is currently one of the main research trends developed at the Department of Engineering Surveying and Civil Engineering at the Faculty of Mining Surveying and Environmental Engineering of AGH University of Science and Technology in Krakow. These mainly relate to issues associated with tower and shell structures, the infrastructure of rail routes, and the development of digital elevation models for a wide range of applications. These issues often require the use of a variety of scanning techniques (stationary, mobile), but the differences also concern the planning of measurement stations and the methods of merging point clouds. Significant differences appear during the analysis of point clouds, especially when modeling objects. Analysis of selected parameters is already possible based on ad hoc measurements carried out on a point cloud. However, only the construction of three-dimensional models provides complete information about the shape of structures, allows the analysis to be performed at any location, and reduces the amount of stored data. Some structures can be modeled in the form of simple axes, sections, or solids; for others it becomes necessary to create sophisticated surface models depicting local deformations. The examples selected for the study allow the scope of measurement and office work to be assessed for a variety of uses related to the issues set forth in the title of this study. Additionally, the latest, forward-looking technology is presented: laser scanning performed from Unmanned Aerial Vehicles (drones). It is currently essentially in the prototype phase, but it may be expected to bring significant progress to numerous applications in the field of engineering surveying.
NASA Astrophysics Data System (ADS)
Raveloson, Andrea; Székely, Balázs; Molnár, Gábor; Rasztovits, Sascha
2013-04-01
Gully erosion is a worldwide problem: it has a number of undesirable effects and gully development is hard to follow. Madagascar is among the most affected countries, as its highlands are densely covered with gullies named lavakas. Lavaka formation and development seem to be triggered by many regional and local causes, but the actual mechanisms are still poorly understood. Furthermore, lavakas differ from normal gullies in their enormous size and special shape. Field surveys are time-consuming, and data from two-dimensional measurements and pictures (even aerial) may lack information essential for morphologic studies. Close-range surveying technologies should therefore be used to obtain three-dimensional information about these unusual and complex features. This contribution discusses which remote sensing and photogrammetric techniques are adequate to survey the development of lavakas, their volume change and sediment budget. Depending on the types and properties (such as volume, depth, shape, vegetation) of the lavaka, different methods will be proposed, showing the pros and cons of each. Our goal is to review techniques to model, survey and analyze lavaka development in order to better understand the causes of their formation, special size and shape. Different methods are evaluated and compared, from field survey through data processing, analyzing cost-effectiveness, potential errors and accuracy for each of them. For this purpose we also consider the time- and cost-effectiveness of the software packages able to render the images into 3D models, as well as the resolution and accuracy of the outputs. Further studies will concentrate on using the three-dimensional models of lavakas for geomorphological studies, in order to understand their special shape and size. This is ILARG contribution #07.
EVALUATION OF METRIC PRECISION FOR A RIPARIAN FOREST SURVEY
This paper evaluates the performance of a protocol to monitor riparian forests in western Oregon based on the quality of the data obtained from a recent field survey. Precision and accuracy are the criteria used to determine the quality of 19 field metrics. The field survey con...
77 FR 46375 - Notice of Intent to Seek Approval To Conduct an Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-03
... approval to conduct a new information collection, the 2013 Residue and Biomass Field Survey. DATES... may submit comments, identified by docket number 0535- NEW, 2013 Residue and Biomass Field Survey by.... Department of Agriculture, (202) 720-4333. SUPPLEMENTARY INFORMATION: Title: Residue and Biomass Field Survey...
A search for extended radio emission from selected compact galaxy groups
NASA Astrophysics Data System (ADS)
Nikiel-Wroczyński, B.; Urbanik, M.; Soida, M.; Beck, R.; Bomans, D. J.
2017-07-01
Context. Studies on compact galaxy groups have led to the conclusion that a plenitude of phenomena take place in between galaxies that form them. However, radio data on these objects are extremely scarce and not much is known concerning the existence and role of the magnetic field in intergalactic space. Aims: We aim to study a small sample of galaxy groups that look promising as possible sources of intergalactic magnetic fields; for example data from radio surveys suggest that most of the radio emission is due to extended, diffuse structures in and out of the galaxies. Methods: We used the Effelsberg 100 m radio telescope at 4.85 GHz and NRAO VLA Sky Survey (NVSS) data at 1.40 GHz. After subtraction of compact sources we analysed the maps searching for diffuse, intergalactic radio emission. Spectral index and magnetic field properties were derived. Results: Intergalactic magnetic fields exist in groups HCG 15 and HCG 60, whereas there are no signs of them in HCG 68. There are also hints of an intergalactic bridge in HCG 44 at 4.85 GHz. Conclusions: Intergalactic magnetic fields exist in galaxy groups and their energy density may be comparable to the thermal (X-ray) density, suggesting an important role of the magnetic field in the intra-group medium, wherever it is detected.
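Deriving a spectral index from the two observing frequencies used above (1.40 and 4.85 GHz) is a one-line computation; this sketch assumes the convention S ∝ ν^α (the sign convention varies between papers, and the numbers below are illustrative, not measurements from this survey):

```python
import math

def spectral_index(s1, nu1, s2, nu2):
    """Spectral index alpha under the convention S ~ nu**alpha.

    s1, s2: flux densities (same units) at frequencies nu1, nu2 (same units).
    """
    return math.log(s2 / s1) / math.log(nu2 / nu1)
```

A source falling from 100 to 30 flux units between 1.40 and 4.85 GHz has α ≈ -0.97, a steep spectrum typical of aged synchrotron emission, whereas equal fluxes give the flat α = 0 typical of compact sources.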
Perspective: Ab initio force field methods derived from quantum mechanics
NASA Astrophysics Data System (ADS)
Xu, Peng; Guidez, Emilie B.; Bertoni, Colleen; Gordon, Mark S.
2018-03-01
It is often desirable to accurately and efficiently model the behavior of large molecular systems in the condensed phase (thousands to tens of thousands of atoms) over long time scales (from nanoseconds to milliseconds). In these cases, ab initio methods are difficult due to the increasing computational cost with the number of electrons. A more computationally attractive alternative is to perform the simulations at the atomic level using a parameterized function to model the electronic energy. Many empirical force fields have been developed for this purpose. However, the functions that are used to model interatomic and intermolecular interactions contain many fitted parameters obtained from selected model systems, and such classical force fields cannot properly simulate important electronic effects. Furthermore, while such force fields are computationally affordable, they are not reliable when applied to systems that differ significantly from those used in their parameterization. They also cannot provide the information necessary to analyze the interactions that occur in the system, making the systematic improvement of the functional forms that are used difficult. Ab initio force field methods aim to combine the merits of both types of methods. The ideal ab initio force fields are built on first principles and require no fitted parameters. Ab initio force field methods surveyed in this perspective are based on fragmentation approaches and intermolecular perturbation theory. This perspective summarizes their theoretical foundation, key components in their formulation, and discusses key aspects of these methods such as accuracy and formal computational cost. The ab initio force fields considered here were developed for different targets, and this perspective also aims to provide a balanced presentation of their strengths and shortcomings. Finally, this perspective suggests some future directions for this actively developing area.
NASA Technical Reports Server (NTRS)
Gardner, J. P.; Straughn, Amber N.; Meurer, Gerhardt R.; Pirzkal, Norbert; Cohen, Seth H.; Malhotra, Sangeeta; Rhoads, James; Windhorst, Rogier A.; Gardner, Jonathan P.; Hathi, Nimish P.;
2007-01-01
The Hubble Space Telescope (HST) Advanced Camera for Surveys (ACS) grism PEARS (Probing Evolution And Reionization Spectroscopically) survey provides a large dataset of low-resolution spectra from thousands of galaxies in the GOODS North and South fields. One important subset of objects in these data are emission-line galaxies (ELGs), and we have investigated several different methods aimed at systematically selecting these galaxies. Here we present a new methodology and results of a search for these ELGs in the PEARS observations of the Hubble Ultra Deep Field (HUDF) using a 2D detection method that utilizes the observation that many emission lines originate from clumpy knots within galaxies. This 2D line-finding method proves to be useful in detecting emission lines from compact knots within galaxies that might not otherwise be detected using more traditional 1D line-finding techniques. We find in total 96 emission lines in the HUDF, originating from 81 distinct "knots" within 63 individual galaxies. We find in general that [O III] emitters are the most common, comprising 44% of the sample, and on average have high equivalent widths (70% of [O III] emitters having rest-frame EW > 100 Å). There are 12 galaxies with multiple emitting knots; several show evidence of variations in H-alpha flux in the knots, suggesting that the differing star formation properties across a single galaxy can in general be probed at redshifts greater than approximately 0.2-0.4. The most prevalent morphologies are large face-on spirals and clumpy interacting systems, many being unique detections owing to the 2D method described here, thus highlighting the strength of this technique.
Phelps, Geoffrey; Cronkite-Ratcliff, Collin; Blake, Kelly
2018-04-19
We have conducted a gravity survey of the Coso geothermal field to continue the time-lapse gravity study of the area initiated in 1991. In this report, we outline a method of processing the gravity data that minimizes the random errors and instrument bias introduced into the data by the Scintrex CG-5 relative gravimeters that were used. After processing, the standard deviation of the data was estimated to be ±13 microGals. These data reveal that the negative gravity anomaly over the Coso geothermal field, centered on gravity station CER1, is continuing to increase in magnitude over time. Preliminary modeling indicates that water-table drawdown at the location of CER1 is between 65 and 326 meters over the last two decades. We note, however, that several assumptions on which the model results depend, such as constant elevation and free-water level over the study period, still require verification.
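The report's actual CG-5 processing chain is more involved (tidal corrections, instrument bias, network adjustment), but the linear instrument-drift term removed via repeated base-station readings within a survey loop can be sketched as follows (function names are mine, and the numbers in the usage example are illustrative):

```python
def drift_correct(readings, base_start, base_end):
    """Linear instrument-drift correction for one relative-gravimeter loop.

    readings: list of (t_hours, mGal) station readings within the loop.
    base_start, base_end: (t_hours, mGal) readings at the base station
    opening and closing the loop; the misclosure is attributed to drift.
    """
    t0, g0 = base_start
    t1, g1 = base_end
    rate = (g1 - g0) / (t1 - t0)  # apparent drift in mGal per hour
    return [(t, g - rate * (t - t0)) for t, g in readings]
```

For instance, a loop that opens at 1000.000 mGal and closes four hours later at 1000.040 mGal implies a 0.01 mGal/hr drift, so a station read at hour two is corrected downward by 0.02 mGal.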
Updates to Post-Flash Calibration for the Advanced Camera for Surveys Wide Field Channel
NASA Astrophysics Data System (ADS)
Miles, Nathan
2018-03-01
This report presents a new technique for generating the post-flash calibration reference file for the Advanced Camera for Surveys (ACS) Wide Field Channel (WFC). The new method substantially reduces, if not altogether eliminates, the presence of dark current artifacts arising from improper dark subtraction, while simultaneously preserving flat-field artifacts. The stability of the post-flash calibration reference file over time is measured using data taken yearly since 2012, and no statistically significant deviations are found. An analysis of all short-flashed darks taken every two days since January 2015 reveals a periodic modulation of the LED intensity on timescales of about one year. This effect is most readily explained by changes in the local temperature in the area surrounding the LED. However, a slight offset between the periods of the temperature and LED modulations leaves open the possibility that the effect is a chance observation of the two sinusoids at an unfortunate point in their beat cycle.
NEOWISE Reactivation Mission Year Three: Asteroid Diameters and Albedos
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masiero, Joseph R.; Mainzer, A. K.; Kramer, E.
The Near-Earth Object Wide-field Infrared Survey Explorer (NEOWISE) reactivation mission has completed its third year of surveying the sky in the thermal infrared for near-Earth asteroids and comets. NEOWISE collects simultaneous observations at 3.4 and 4.6 μm of solar system objects passing through its field of regard. These data allow for the determination of total thermal emission from bodies in the inner solar system, and thus the sizes of these objects. In this paper, we present thermal model fits of asteroid diameters for 170 NEOs and 6110 Main Belt asteroids (MBAs) detected during the third year of the survey, as well as the associated optical geometric albedos. We compare our results with previous thermal model results from NEOWISE for overlapping sample sets, as well as diameters determined through other independent methods, and find that our diameter measurements for NEOs agree to within 26% (1σ) of previously measured values. Diameters for the MBAs are within 17% (1σ). This brings the total number of unique near-Earth objects characterized by the NEOWISE survey to 541, surpassing the number observed during the fully cryogenic mission in 2010.
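Once a thermally derived diameter is in hand, the optical geometric albedo follows from the absolute magnitude H through the standard diameter-albedo relation. The sketch below uses that conventional relation for illustration; it is not necessarily the exact thermal-model machinery used by the NEOWISE pipeline:

```python
import math

def diameter_km(H, p_v):
    """Effective asteroid diameter (km) from absolute magnitude H and
    geometric visible albedo p_v, via D = (1329 / sqrt(p_v)) * 10**(-H/5)."""
    return (1329.0 / math.sqrt(p_v)) * 10.0 ** (-H / 5.0)
```

For a typical small NEO with H = 18 and p_v = 0.14 this gives a diameter just under 1 km; halving the assumed albedo inflates the diameter by a factor of √2, which is why independent thermal sizes are so valuable.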
The Next Generation Transit Survey (NGTS)
NASA Astrophysics Data System (ADS)
Wheatley, Peter J.; West, Richard G.; Goad, Michael R.; Jenkins, James S.; Pollacco, Don L.; Queloz, Didier; Rauer, Heike; Udry, Stéphane; Watson, Christopher A.; Chazelas, Bruno; Eigmüller, Philipp; Lambert, Gregory; Genolet, Ludovic; McCormac, James; Walker, Simon; Armstrong, David J.; Bayliss, Daniel; Bento, Joao; Bouchy, François; Burleigh, Matthew R.; Cabrera, Juan; Casewell, Sarah L.; Chaushev, Alexander; Chote, Paul; Csizmadia, Szilárd; Erikson, Anders; Faedi, Francesca; Foxell, Emma; Gänsicke, Boris T.; Gillen, Edward; Grange, Andrew; Günther, Maximilian N.; Hodgkin, Simon T.; Jackman, James; Jordán, Andrés; Louden, Tom; Metrailler, Lionel; Moyano, Maximiliano; Nielsen, Louise D.; Osborn, Hugh P.; Poppenhaeger, Katja; Raddi, Roberto; Raynard, Liam; Smith, Alexis M. S.; Soto, Maritza; Titz-Weider, Ruth
2018-04-01
We describe the Next Generation Transit Survey (NGTS), which is a ground-based project searching for transiting exoplanets orbiting bright stars. NGTS builds on the legacy of previous surveys, most notably WASP, and is designed to achieve higher photometric precision and hence find smaller planets than have previously been detected from the ground. It also operates in red light, maximizing sensitivity to late K and early M dwarf stars. The survey specifications call for photometric precision of 0.1 per cent in red light over an instantaneous field of view of 100 deg², enabling the detection of Neptune-sized exoplanets around Sun-like stars and super-Earths around M dwarfs. The survey is carried out with a purpose-built facility at Cerro Paranal, Chile, which is the premier site of the European Southern Observatory (ESO). An array of twelve 20 cm f/2.8 telescopes fitted with back-illuminated deep-depletion CCD cameras is used to survey fields intensively at intermediate Galactic latitudes. The instrument is also ideally suited to ground-based photometric follow-up of exoplanet candidates from space telescopes such as TESS, Gaia and PLATO. We present observations that combine precise autoguiding and the superb observing conditions at Paranal to provide routine photometric precision of 0.1 per cent in 1 h for stars with I-band magnitudes brighter than 13. We describe the instrument and data analysis methods as well as the status of the survey, which achieved first light in 2015 and began full-survey operations in 2016. NGTS data will be made publicly available through the ESO archive.
Airport geomagnetic surveys in the United States
Berarducci, A.
2006-01-01
The Federal Aviation Administration (FAA) and the United States military have requirements for the design, location, and construction of compass calibration pads (compass roses), developed through collaboration with US Geological Survey (USGS) personnel. These requirements are detailed in the FAA Advisory Circular AC 150/5300-13, Appendix 4, and in various military documents, such as Handbook 1021/1, but the major requirement is that the range of declination measured within 75 meters of the center of a compass rose be less than or equal to 30 minutes of arc. The USGS Geomagnetism Group has developed specific methods for conducting a magnetic survey so that existing compass roses can be judged against the needed standards and new sites can be evaluated for their suitability as potential compass roses. First, a preliminary survey is performed with a total-field magnetometer, with differences over the site area of less than 75 nT being sufficient to warrant additional, more detailed surveying. Next, a number of survey points are established over the compass rose area and nearby, where declination is to be measured with an instrument capable of measuring declination to within 1 minute of arc, such as a Gurley transit magnetometer, DI Flux theodolite magnetometer, or Wild T-0. The data are corrected for diurnal and irregular effects of the magnetic field, and declination is determined for each survey point, as well as the declination range and average for the entire compass rose site. Altogether, a typical survey takes about four days to complete. © 2006 Springer.
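The core acceptance criterion described above (declination range within 75 m of the center no greater than 30 minutes of arc) can be expressed directly. A minimal sketch, assuming survey points are given as (distance-from-center, declination) pairs; the function name and data layout are illustrative:

```python
def compass_rose_ok(points, max_range_arcmin=30.0, radius_m=75.0):
    """Apply the FAA/USGS criterion: the range of declination measured
    within radius_m of the compass rose center must not exceed
    max_range_arcmin.

    points: list of (distance_from_center_m, declination_arcmin) tuples,
    already corrected for diurnal and irregular field effects.
    """
    decs = [d for r, d in points if r <= radius_m]
    if not decs:
        raise ValueError("no survey points inside the radius")
    return (max(decs) - min(decs)) <= max_range_arcmin
```

Points beyond 75 m are ignored by the criterion, so a site can pass even if a distant point shows a large declination excursion.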
Information and informatics literacies of first-year medical students
Bouquin, Daina R.; Tmanova, Lyubov L.; Wright, Drew
2015-01-01
Purpose The study evaluated medical students' familiarity with information literacy and informatics during the health sciences library orientation. Methods A survey was fielded at the start of the 2013 school year. Results Seventy-two of 77 students (94%) completed the survey. Over one-half (57%) expected to use library research materials and services. About half (43%) expected to use library physical space. Students preferred accessing biomedical research on laptops and learning via online-asynchronous modes. Conclusions The library identified areas for service development and outreach to medical students and academic departments. PMID:26512221
Radiological Defense. Volume 4. An Introduction to Radiological Instruments for Military Use
1950-01-01
… alpha admission. Thin mica and stretched nylon walls … nonmetals of low atomic number … rubber hydrochloride films about 5 microns … individual pulse. This process is called quenching and can be accomplished by using external electronic methods or … counter tubes presently employed in survey meters … field survey work. Mica windows down to 0.5 mg/cm² are available, although in normal …
NASA Astrophysics Data System (ADS)
Maki, Toshihiro; Ura, Tamaki; Singh, Hanumant; Sakamaki, Takashi
Large-area seafloor imaging will bring significant benefits to various fields such as academia, resource surveys, marine development, security, and search-and-rescue. The authors have proposed a navigation method for an autonomous underwater vehicle for seafloor imaging, and verified its performance by mapping tubeworm colonies over an area of 3,000 square meters using the AUV Tri-Dog 1 at the Tagiri vent field, Kagoshima Bay, Japan (Maki et al., 2008, 2009). This paper proposes a post-processing method to build a natural photo mosaic from a number of pictures taken by an underwater platform. The method first removes lens distortion and non-uniformities of color and lighting from each image, and then performs ortho-rectification based on the camera pose and seafloor relief estimated from navigation data. The image alignment is based on both navigation data and visual characteristics, implemented as an extension of the image-based method (Pizarro et al., 2003). Using the two types of information yields an image alignment that is consistent both globally and locally, and makes the method applicable to data sets with few visual cues. The method was evaluated using a data set obtained by the AUV Tri-Dog 1 at the vent field in Sep. 2009. A seamless, uniformly illuminated photo mosaic covering an area of around 500 square meters was created from 391 pictures, covering unique features of the field such as bacteria mats and tubeworm colonies.
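Ortho-rectification proper requires the full estimated camera pose, but the navigation-driven geometry behind image alignment — the ground footprint of each frame and the overlap between consecutive frames — can be sketched simply. This assumes an idealized nadir-pointing camera over flat seafloor; names and numbers are illustrative, not from the AUV system described above:

```python
import math

def ground_footprint(altitude_m, fov_deg):
    """Across-track footprint width of a nadir-pointing camera
    at a given altitude, from its full field-of-view angle."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

def overlap_fraction(spacing_m, footprint_m):
    """Fraction of overlap between consecutive images, given the
    along-track spacing between camera positions from navigation data."""
    return max(0.0, 1.0 - spacing_m / footprint_m)
```

With, say, a 90° field of view at 10 m altitude the footprint is 20 m, so imaging every 5 m along track yields 75% overlap — enough shared content for visual matching even where the seafloor offers few distinctive features.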
Geophysical investigation using gravity data in Kinigi geothermal field, northwest Rwanda
NASA Astrophysics Data System (ADS)
Uwiduhaye, Jean d.'Amour; Mizunaga, Hideki; Saibi, Hakim
2018-03-01
A land gravity survey was carried out in the Kinigi geothermal field, Northwest Rwanda, using 184 gravity stations during August and September 2015. The aim of the gravity survey was to understand the subsurface structure and its relation to the observed surface manifestations in the study area. The complete Bouguer gravity anomaly was produced with a reduction density of 2.4 g/cm3. Bouguer anomalies ranging from -52 to -35 mGal were observed in the study area, with relatively high anomalies in the east and northwest zones and low anomalies on the southwest side of the studied area. A decrease of 17 mGal is observed in the southwestern part of the study area, caused by the low density of the Tertiary rocks. Horizontal gradient, tilt angle and analytical signal methods were applied to the observed gravity data and showed that the Mubona, Mpenge and Cyabararika surface springs are structurally controlled while the Rubindi spring is not. The integrated results of the gravity gradient interpretation methods delineated a dominant geological structure trending NW-SE, in agreement with the regional geological trend. The results of this gravity study will help guide future geothermal exploration and development in the Kinigi geothermal field.
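The three gradient-interpretation quantities named above — horizontal gradient, tilt angle, and analytical signal — have simple pointwise definitions. A sketch assuming the three gravity-gradient components (gx, gy, gz) have already been computed at a station; in practice these come from finite differences or Fourier-domain filtering of the gridded Bouguer anomaly:

```python
import math

def horizontal_gradient(gx, gy):
    """Magnitude of the horizontal gravity gradient; peaks over
    near-vertical density contrasts such as faults."""
    return math.hypot(gx, gy)

def tilt_angle(gx, gy, gz):
    """Tilt angle (radians): arctangent of the vertical derivative over
    the horizontal gradient magnitude; its zero crossing tracks
    source edges regardless of anomaly amplitude."""
    return math.atan2(gz, math.hypot(gx, gy))

def analytical_signal(gx, gy, gz):
    """Amplitude of the analytical signal, largely independent of
    the magnetization/density-contrast direction."""
    return math.sqrt(gx * gx + gy * gy + gz * gz)
```

The tilt angle is bounded between -π/2 and +π/2 by construction, which is what makes it useful for enhancing subtle edges in the presence of strong regional anomalies.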
Cervantes, Jose A; Costello, Collin M; Maarouf, Melody; McCrary, Hilary C; Zeitouni, Nathalie C
2017-09-01
A realistic model for the instruction of basic dermatologic procedural skills was developed, while simultaneously increasing medical student exposure to the field of dermatology. The primary purpose of the authors' study was to evaluate the utilization of a fresh-tissue cadaver model (FTCM) as a method for the instruction of common dermatologic procedures. The authors' secondary aim was to assess students' perceived clinical skills and overall perception of the field of dermatology after the lab. Nineteen first- and second-year medical students were pre- and post-tested on their ability to perform punch and excisional biopsies on a fresh-tissue cadaver. Students were then surveyed on their experience. Assessment of the cognitive knowledge gain and technical skills revealed a statistically significant improvement in all categories (p < .001). An analysis of the survey demonstrated that 78.9% were more interested in selecting dermatology as a career and 63.2% of participants were more likely to refer their future patients to a Mohs surgeon. An FTCM is a viable method for the instruction and training of dermatologic procedures. In addition, the authors conclude that an FTCM provides realistic instruction for common dermatologic procedures and enhances medical students' early exposure and interest in the field of dermatology.
Ntonifor, N N; Ngufor, C A; Kimbi, H K; Oben, B O
2006-10-01
To document and test the efficacy of indigenous traditional personal protection methods against mosquito bites and general nuisance. A prospective study based on a survey and field evaluation of selected plant-based personal protection methods against mosquito bites. Bolifamba, a rural settlement of the Mount Cameroon region. A structured questionnaire was administered to 179 respondents and two anti-mosquito measures were tested under field conditions. Identified traditional anti-mosquito methods used by indigenes of Bolifamba. Two plants tested under field conditions were found to be effective. Of the 179 respondents, 88 (49.16%) used traditional anti-mosquito methods; 57 (64.77%) used plant-based methods while 31 (35.2%) used various petroleum oils. The remaining respondents, 91 (50.8%), used conventional personal protection methods. Reasons for using traditional methods were that they were available and affordable and that more effective alternatives were not known. The demerits of these methods were that they are laborious to implement, stain clothing, and produce a lot of smoke and repulsive odours when used; those of conventional methods were lack of adequate information about them, high cost and non-availability. When the two most frequently used plants, Saccharum officinarum and Ocimum basilicum, were evaluated under field conditions, each gave better protection than the control. Most plants used against mosquitoes in the area are known potent mosquito repellents, but others identified in the study warrant further research. The two tested under field conditions were effective, though less so than the commonly used commercial diethyltoluamide.
NASA Astrophysics Data System (ADS)
Ding, J.; Wang, G.; Xiong, L.; Zhou, X.; England, E.
2017-12-01
Coastal regions are naturally vulnerable to long-term coastal erosion and to episodic coastal hazards caused by extreme weather events; major geomorphic changes can occur within a few hours during storms. Predicting storm impact, planning coastal development, and assessing resilience after natural events all require accurate and up-to-date topographic maps of coastal morphology. The ability to conduct rapid, high-resolution, high-accuracy topographic mapping is therefore of critical importance for long-term coastal management and for rapid response after natural hazard events. Terrestrial laser scanning (TLS) techniques have frequently been applied to beach and dune erosion studies and post-hazard responses. However, TLS surveying is relatively slow and costly for rapid deployment, and it unavoidably leaves gray areas that laser pulses cannot reach, particularly in wetlands, which often lack direct access. Aerial mapping using photogrammetry from images taken by unmanned aerial vehicles (UAVs) has become a new technique for rapid topographic mapping. UAV photogrammetry provides the ability to map coastal features quickly, safely, inexpensively, on short notice, and with minimal impact. The primary products of photogrammetry are point clouds similar to LiDAR point clouds. However, a large number of ground control points (ground truth) are essential for obtaining high-accuracy UAV maps. These control points are often obtained by a GPS survey conducted simultaneously with the TLS survey in the field, which can be a slow and arduous process. This study aims to develop methods for acquiring a large number of ground control points from the TLS survey and for validating the point clouds obtained from photogrammetry against the TLS point clouds. A Riegl VZ-2000 TLS scanner was used to develop the laser point clouds and a DJI Phantom 4 Pro UAV was used to acquire images.
The aerial images were processed with the photogrammetric mapping software Agisoft PhotoScan. A workflow for conducting rapid TLS and UAV surveys in the field and for integrating the point clouds obtained from TLS and UAV surveying is introduced. Key words: UAV photogrammetry, ground control points, TLS, coastal morphology, topographic mapping
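The abstract does not give the validation details; as an illustration only, cloud-to-cloud nearest-neighbour distances are a common way to compare a photogrammetric point cloud against a TLS reference. A minimal sketch with synthetic stand-in clouds (all sizes and noise levels here are invented, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the two point clouds (N x 3 arrays of x, y, z):
# a dense TLS reference cloud, and a sparser UAV photogrammetry cloud modeled
# as a subsample of the same surface plus small random error.
tls = rng.uniform(0.0, 10.0, size=(2000, 3))
uav = tls[rng.choice(2000, 500, replace=False)] + rng.normal(0.0, 0.05, (500, 3))

def cloud_to_cloud_distances(query, reference):
    """For each query point, the distance to its nearest reference point."""
    # Brute-force distance matrix; real clouds would use a KD-tree instead.
    d = np.linalg.norm(query[:, None, :] - reference[None, :, :], axis=2)
    return d.min(axis=1)

dist = cloud_to_cloud_distances(uav, tls)
print(f"mean error {dist.mean():.3f} m, 95th pct {np.quantile(dist, 0.95):.3f} m")
```

For real clouds, the two data sets must first be co-registered (e.g. via the shared ground control points), and a KD-tree such as `scipy.spatial.cKDTree` would replace the brute-force distance matrix.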
47 CFR 5.87 - Frequencies for field strength surveys or equipment demonstrations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 47 Telecommunication 1 2012-10-01 2012-10-01 false Frequencies for field strength surveys or... EXPERIMENTAL RADIO SERVICE (OTHER THAN BROADCAST) Applications and Licenses § 5.87 Frequencies for field strength surveys or equipment demonstrations. (a) Authorizations issued under §§ 5.3 (e) and (f) of this...
47 CFR 5.87 - Frequencies for field strength surveys or equipment demonstrations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Frequencies for field strength surveys or... EXPERIMENTAL RADIO SERVICE (OTHER THAN BROADCAST) Applications and Licenses § 5.87 Frequencies for field strength surveys or equipment demonstrations. (a) Authorizations issued under §§ 5.3 (e) and (f) of this...
47 CFR 5.87 - Frequencies for field strength surveys or equipment demonstrations.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 1 2011-10-01 2011-10-01 false Frequencies for field strength surveys or... EXPERIMENTAL RADIO SERVICE (OTHER THAN BROADCAST) Applications and Licenses § 5.87 Frequencies for field strength surveys or equipment demonstrations. (a) Authorizations issued under §§ 5.3 (e) and (f) of this...
Xu, Tao; Ye, Shaoqing; Shi, Yu; Xu, Tongtong; Wang, Qi; Li, Si; Zhang, Youruo; Lan, Yi; Zhao, Ling
2017-08-12
To understand Chengdu residents' knowledge of the acupuncture disease spectrum and their expectations of acupuncture treatment. A questionnaire on the knowledge of the acupuncture disease spectrum and expectations of acupuncture treatment among Chengdu residents was designed; data were collected through field sampling and an internet survey and then analyzed. In total, 1,548 valid questionnaires were collected, including 1,041 from the field survey and 507 from the internet survey. The results indicated that knowledge of acupuncture and its disease spectrum was moderate among Chengdu residents. Within the acupuncture disease spectrum, the condition with the highest recognition was insomnia (45.0% in the field survey and 75.4% in the internet survey), while the condition with the lowest recognition was infertility (8.3% in the field survey and 34.3% in the internet survey). Knowledge of acupuncture among Chengdu residents could be further improved, and public science education about acupuncture should be strengthened in the future.
NASA Astrophysics Data System (ADS)
Takemine, S.; Rikimaru, A.; Takahashi, K.
Rice is one of the staple foods of the world. High-quality rice production requires periodically collecting growth data to control the growth of the rice. Plant height, stem number, and leaf color are well-known parameters indicating rice growth, and a rice growth diagnosis method based on these parameters is used operationally in Japan, although collecting them by field survey demands a lot of labor and time. Recently, a labor-saving method for rice growth diagnosis was proposed that is based on the vegetation cover rate of rice. The vegetation cover rate is calculated by discriminating rice plant areas in a digital camera image photographed in the nadir direction; discrimination of the rice plant areas is done by automatic binarization processing. However, when the vegetation cover rate calculation depends on automatic binarization alone, the computed cover rate can decrease even as the rice grows. In this paper, a calculation method for the vegetation cover rate is proposed that is based on automatic binarization but also refers to growth hysteresis information. For several images obtained by field survey during the rice growing season, the vegetation cover rate was calculated by the conventional automatic binarization processing and by the proposed method, and the results of both methods were compared with reference values obtained by visual interpretation. The comparison showed that the accuracy of discriminating rice plant areas was increased by the proposed method.
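The paper's growth-hysteresis correction is not described in enough detail here to reproduce, but the baseline step, automatic binarization of a nadir image to obtain a vegetation cover rate, can be sketched. Below, a synthetic excess-green index image stands in for the camera data, and Otsu's method is used as the automatic binarization (an assumed choice for illustration, not necessarily the authors'):

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                 # weight of the low class
    m = np.cumsum(p * centers)        # cumulative mean
    mt = m[-1]                        # overall mean
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros_like(w0)
    between[valid] = (mt * w0[valid] - m[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(between)]

# Synthetic nadir image: an excess-green index (ExG = 2G - R - B) separates
# rice plants (high ExG) from soil/water background (low ExG).
rng = np.random.default_rng(1)
exg = np.where(rng.random((100, 100)) < 0.4,
               rng.normal(0.6, 0.1, (100, 100)),   # plant pixels
               rng.normal(0.1, 0.1, (100, 100)))   # background pixels
t = otsu_threshold(exg.ravel())
cover_rate = (exg > t).mean()
print(f"vegetation cover rate: {cover_rate:.2f}")
```

The failure mode the paper addresses would appear here as a threshold that drifts upward late in the season, cutting mature plant pixels out of the "plant" class; referring to previous dates (growth hysteresis) is the proposed guard against that.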
General introduction for the “National Field Manual for the Collection of Water-Quality Data”
2018-02-28
Background: As part of its mission, the U.S. Geological Survey (USGS) collects data to assess the quality of our Nation’s water resources. A high degree of reliability and standardization of these data is paramount to fulfilling this mission. Documentation of nationally accepted methods used by USGS personnel serves to maintain consistency and technical quality in data-collection activities. “The National Field Manual for the Collection of Water-Quality Data” (NFM) provides documented guidelines and protocols for USGS field personnel who collect water-quality data. The NFM provides detailed, comprehensive, and citable procedures for monitoring the quality of surface water and groundwater. Topics in the NFM include (1) methods and protocols for sampling water resources, (2) methods for processing samples for analysis of water quality, (3) methods for measuring field parameters, and (4) specialized procedures, such as sampling water for low levels of mercury and organic wastewater chemicals, measuring biological indicators, and sampling bottom sediment for chemistry. Personnel who collect water-quality data for national USGS programs and projects, including projects supported by USGS cooperative programs, are mandated to use the protocols provided in the NFM per USGS Office of Water Quality Technical Memorandum 2002.13. Formal training, for example, as provided in the USGS class “Field Water-Quality Methods for Groundwater and Surface Water,” and field apprenticeships supplement the guidance provided in the NFM and ensure that the data collected are high quality, accurate, and scientifically defensible.
Source field effects in the auroral zone: Evidence from the Slave craton (NW Canada)
NASA Astrophysics Data System (ADS)
Lezaeta, Pamela; Chave, Alan; Jones, Alan G.; Evans, Rob
2007-09-01
We present an investigation of source field effects in long-period magnetotelluric (MT) data collected on the floors of lakes throughout the Slave craton (NW Canada) from 1998 to 2000. Monthly and daily power spectra of the magnetic fields suggest a dynamic and seasonally varying source, with atypical geomagnetic activity in the year 2000. Bounded-influence MT and GDS responses were obtained for periods between 80 and 25,000 s over selected monthly time segments. The responses at periods over 4000 s vary from month to month, suggesting source field effects. A frequency-domain principal component (PC) method was applied to the array to investigate the spatial form of the source field variations. The PC analysis was tested with synthetic data from a regional 3D model with a uniform external source to study the sensitivity of the eigenvectors to conductivity structure, demonstrating a negligible influence with increasing penetration depth. We conclude that magnetic fields at periods near one half day are subject to a 1D polarized source of relatively homogeneous morphology over the survey area during any month recorded, except for the summer month of July 2000, which had particularly high geomagnetic activity. In general, the source space approaches two polarizations at periods below one half day, with the dominant NS component remaining quasi-homogeneous over the survey area at periods over 1000 s.
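The abstract does not specify how the frequency-domain principal component analysis was implemented; a toy numpy version illustrates the idea it rests on: if every station in the array sees one spatially coherent source, the cross-spectral matrix at a given frequency is dominated by a single eigenvector. All array sizes, gains, and noise levels below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n_sta, n_seg, seg_len = 6, 200, 256

# Hypothetical array: each station records the same source signal scaled by a
# station-dependent response, plus independent noise.
gains = rng.uniform(0.8, 1.2, n_sta)
k = 10  # frequency bin examined
X = np.zeros((n_sta, n_seg), dtype=complex)  # one Fourier coeff per segment
for s in range(n_seg):
    src = rng.normal(size=seg_len)
    for i in range(n_sta):
        rec = gains[i] * src + 0.3 * rng.normal(size=seg_len)
        X[i, s] = np.fft.rfft(rec)[k]

# Cross-spectral density matrix at bin k, averaged over segments.
S = (X @ X.conj().T) / n_seg
evals, evecs = np.linalg.eigh(S)
coherent_fraction = evals[-1] / evals.sum()
print(f"dominant-mode power fraction: {coherent_fraction:.2f}")
```

A dominant-mode fraction near one indicates a quasi-uniform (single-polarization) source over the array; two comparable eigenvalues would indicate the two-polarization regime described in the abstract.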
Development and Validation of Environmental DNA (eDNA) Markers for Detection of Freshwater Turtles.
Davy, Christina M; Kidd, Anne G; Wilson, Chris C
2015-01-01
Environmental DNA (eDNA) is a potentially powerful tool for detection and monitoring of rare species, including threatened native species and recently arrived invasive species. Here, we develop DNA primers for a suite of nine sympatric freshwater turtles, and use it to test whether turtle eDNA can be successfully detected in samples from aquaria and an outdoor pond. We also conduct a cost comparison between eDNA detection and detection through traditional survey methods, using data from field surveys at two sites in our target area. We find that eDNA from turtles can be detected using both conventional polymerase chain reaction (PCR) and quantitative PCR (qPCR), and that the cost of detection through traditional survey methods is 2-10X higher than eDNA detection for the species in our study range. We summarize necessary future steps for application of eDNA surveys to turtle monitoring and conservation and propose specific cases in which the application of eDNA could further the conservation of threatened turtle species.
Characterizing unknown systematics in large scale structure surveys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agarwal, Nishant; Ho, Shirley; Myers, Adam D.
Photometric large-scale structure (LSS) surveys probe the largest volumes in the Universe, but are inevitably limited by systematic uncertainties. Imperfect photometric calibration leads to biases in our measurements of the density fields of LSS tracers such as galaxies and quasars, and consequently in cosmological parameter estimation. Earlier studies have proposed using cross-correlations between different redshift slices, or between different surveys, to reduce the effects of such systematics. In this paper we develop a method to characterize unknown systematics. We demonstrate that while we do not have sufficient information to correct for unknown systematics in the data, we can obtain an estimate of their magnitude. We define a parameter to estimate contamination from unknown systematics using cross-correlations between different redshift slices, and propose discarding bins in the angular power spectrum that lie outside a certain contamination tolerance level. We show that this method improves estimates of the bias using simulated data, and we further apply it to photometric luminous red galaxies in the Sloan Digital Sky Survey as a case study.
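As a hedged illustration of the core idea (a toy, not the paper's estimator): cosmological signal in widely separated redshift slices is nearly uncorrelated, so a nonzero zero-lag cross-correlation between two slice maps betrays a shared systematic, whose amplitude can be estimated even though the systematic map itself is unknown:

```python
import numpy as np

rng = np.random.default_rng(3)
npix = 50_000

# Toy maps of two disjoint redshift slices: independent cosmological signal
# per slice plus a shared systematic template (e.g. a calibration pattern)
# entering with unknown amplitude eps.
sys_template = rng.normal(size=npix)   # unit-variance systematic map
eps = 0.2                              # true contamination amplitude
slice1 = rng.normal(size=npix) + eps * sys_template
slice2 = rng.normal(size=npix) + eps * sys_template

# True signals in disjoint slices are uncorrelated, so the zero-lag
# cross-correlation is driven by the systematic: <d1 d2> ~ eps^2.
cross = np.mean(slice1 * slice2)
eps_est = np.sqrt(max(cross, 0.0))
print(f"estimated contamination amplitude: {eps_est:.2f}")
```

The paper works with angular power spectra rather than pixel products, but the logic is the same: cross-slice power that the cosmological model cannot produce is attributed to contamination, and bins exceeding a tolerance are discarded.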
The Feasibility of E-Learning Implementation in an Iranian University
ERIC Educational Resources Information Center
Mirzamohammadi, M. H.
2017-01-01
The present research aimed to investigate the feasibility of e-learning implementation in an Iranian comprehensive university (including medical and non-medical fields) and to provide appropriate solutions in this regard. To achieve this objective, seven research questions were formed. A survey method was applied for data collection in this study.…
Integrating Research Proposals into a Field Seminar: Feedback from Students
ERIC Educational Resources Information Center
Kvarfordt, Connie L.; Carter, Irene; Park, Wansoo; Yun, Sung Hyun
2014-01-01
Teaching research to social work students continues to be the topic of an ongoing dialog within the profession. The literature suggests that some form of a "real-world" or "hands-on" learning opportunity has many benefits for teaching a subject that students often are reluctant to engage in. Using online survey methods,…
Statistical Inference and Patterns of Inequality in the Global North
ERIC Educational Resources Information Center
Moran, Timothy Patrick
2006-01-01
Cross-national inequality trends have historically been a crucial field of inquiry across the social sciences, and new methodological techniques of statistical inference have recently improved the ability to analyze these trends over time. This paper applies Monte Carlo, bootstrap inference methods to the income surveys of the Luxembourg Income…
On the Measurement and Properties of Ambiguity in Probabilistic Expectations
ERIC Educational Resources Information Center
Pickett, Justin T.; Loughran, Thomas A.; Bushway, Shawn
2015-01-01
Survey respondents' probabilistic expectations are now widely used in many fields to study risk perceptions, decision-making processes, and behavior. Researchers have developed several methods to account for the fact that the probability of an event may be more ambiguous for some respondents than others, but few prior studies have empirically…
Student and Teacher Perspectives of Technology Usage
ERIC Educational Resources Information Center
Elliott, Lori L.
2010-01-01
This study examined the use of technology by eighth grade students and teachers and perceptions of students and teachers toward technology use in the classroom and home. A mixed design method was selected to collect and analyze the data. Face-to-face interviews, field notes, and national survey results were used to triangulate the data. Three…
Computer Software: Does It Support a New View of Reading?
ERIC Educational Resources Information Center
Case, Carolyn J.
A study examined commercially available computer software to ascertain its degree of congruency with current methods of reading instruction (the Interactive model) at the first and second grade levels. A survey was conducted of public school educators in Connecticut and experts in the field to determine their level of satisfaction with available…
A call for a new speciality: Forensic odontology as a subject
Wadhwan, Vijay; Shetty, Devi Charan; Jain, Anshi; Khanna, Kaveri Surya; Gupta, Amit
2014-01-01
Background: Forensic science is defined as a discipline concerned with the application of science and technology to the detection and investigation of crime and the administration of justice, requiring the coordinated efforts of a multidisciplinary team. Dental identification remains one of the most reliable and frequently applied methods of identification. Forensic odontology can hence be defined as the science that deals with evidence from the dental and oral structures, and it is a specialty in itself. Objectives: To analyze the level of awareness of forensic odontology among individuals in the field of dentistry with the help of a survey. Materials and Methods: A questionnaire was prepared and a survey was conducted with a sample size of 200, divided into four groups. Results: The survey revealed inadequate knowledge, poor attitudes, and a lack of practice of forensic odontology among the dentists. Conclusion: Our study reflects the current situation in our country in the field of forensic odontology, which could be improved by introducing forensic odontology as a subject in the dental curriculum at both the undergraduate and the post-graduate levels. PMID:25125916
Quantitative Laughter Detection, Measurement, and Classification-A Critical Survey.
Cosentino, Sarah; Sessa, Salvatore; Takanishi, Atsuo
2016-01-01
The study of human nonverbal social behaviors has taken a more quantitative and computational approach in recent years due to the development of smart interfaces and of virtual agents or robots able to interact socially. One of the most interesting nonverbal social behaviors, producing a characteristic vocal signal, is laughing. Laughter is produced in several different situations: in response to external physical, cognitive, or emotional stimuli; to negotiate social interactions; and also, pathologically, as a consequence of neural damage. For this reason, laughter has attracted researchers from many disciplines. A consequence of this multidisciplinarity is the absence of a holistic vision of this complex behavior: the methods of analysis and classification of laughter, as well as the terminology used, are heterogeneous, and the findings are sometimes contradictory and poorly documented. This survey aims to collect and present objective measurement methods and results from a variety of studies in different fields, to help build a unified model and taxonomy of laughter. Such a model could support advances in several fields, from artificial intelligence and human-robot interaction to medicine and psychiatry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blanc, Guillermo A.; Weinzirl, Tim; Song, Mimi
2013-05-15
We present the survey design, data reduction, and spectral fitting pipeline for the VIRUS-P Exploration of Nearby Galaxies (VENGA). VENGA is an integral field spectroscopic survey that maps the disks of 30 nearby spiral galaxies. Targets span a wide range in Hubble type, star formation activity, morphology, and inclination. The VENGA data cubes have 5.6" FWHM spatial resolution and ~5 Å FWHM spectral resolution, sample the 3600-6800 Å range, and cover large areas, typically sampling galaxies out to ~0.7 R25. These data cubes can be used to produce two-dimensional maps of the star formation rate, dust extinction, electron density, stellar population parameters, the kinematics and chemical abundances of both stars and ionized gas, and other physical quantities derived from the fitting of the stellar spectrum and the measurement of nebular emission lines. To exemplify our methods and the quality of the data, we present the VENGA data cube for the face-on Sc galaxy NGC 628 (a.k.a. M 74). The VENGA observations of NGC 628 are described, as well as the construction of the data cube, our spectral fitting method, and the fitting of the stellar and ionized gas velocity fields. We also propose a new method to measure the inclination of nearly face-on systems based on matching the stellar and gas rotation curves using asymmetric drift corrections. VENGA will measure relevant physical parameters across different environments within these galaxies, allowing a series of studies on star formation, structure assembly, stellar populations, chemical evolution, galactic feedback, nuclear activity, and the properties of the interstellar medium in massive disk galaxies.
Li, Peipei; Zhao, Zhenjun; Wang, Ying; Xing, Hua; Parker, Daniel M; Yang, Zhaoqing; Baum, Elizabeth; Li, Wenli; Sattabongkot, Jetsumon; Sirichaisinthop, Jeeraphat; Li, Shuying; Yan, Guiyun; Cui, Liwang; Fan, Qi
2014-05-08
Nested PCR is considered a sensitive and specific method for detecting malaria parasites and is especially useful in epidemiological surveys. However, the preparation of DNA templates for PCR is often time-consuming and costly. A simplified PCR method was developed that directly uses a small blood filter paper square (2 × 2 mm) as the DNA template after treatment with saponin. This filter paper-based nested PCR method (FP-PCR) was compared to microscopy and to standard nested PCR with DNA extracted using a Qiagen DNA mini kit from filter paper blood spots of 204 febrile cases. The FP-PCR technique was further applied to evaluate malaria infections in 1,708 participants from cross-sectional epidemiological surveys conducted in Myanmar and Thailand. The FP-PCR method had a detection limit of ~0.2 parasites/μL of blood, estimated using cultured Plasmodium falciparum parasites. With the 204 field samples, the sensitivity of the FP-PCR method was comparable to that of the standard nested PCR method and significantly higher than that of microscopy. Application of the FP-PCR method in large cross-sectional studies conducted in Myanmar and Thailand detected 1.9% (12/638) and 6.2% (66/1,070) asymptomatic Plasmodium infections, respectively, as compared to detection rates of 1.3% (8/638) and 0.4% (4/1,070) by microscopy. The FP-PCR method was thus much more sensitive than microscopy in detecting Plasmodium infections. It drastically increased the detection sensitivity of asymptomatic infections in the cross-sectional surveys conducted in Thailand and Myanmar, suggesting that FP-PCR has potential for future applications in malaria epidemiology studies.
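The detection rates quoted in the abstract can be recomputed directly from the raw counts; a quick check:

```python
# Detection counts reported in the abstract: (positives, tested) per site/method.
surveys = {
    "Myanmar":  {"FP-PCR": (12, 638),  "microscopy": (8, 638)},
    "Thailand": {"FP-PCR": (66, 1070), "microscopy": (4, 1070)},
}

for site, methods in surveys.items():
    for method, (pos, n) in methods.items():
        print(f"{site:8s} {method:10s} {100.0 * pos / n:5.2f}% ({pos}/{n})")
```

Note that 4/1,070 is 0.37%, i.e. about 0.4%, which is what the microscopy rate for Thailand rounds to.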
NASA Technical Reports Server (NTRS)
Rogallo, Vernon L.; Yaggy, Paul F.; McCloud, John L., III
1956-01-01
A simplified procedure is shown for calculating the once-per-revolution oscillating aerodynamic thrust loads on propellers of tractor airplanes at zero yaw. The only flow-field information required for the application of the procedure is a knowledge of the upflow angles at the horizontal center line of the propeller disk. Methods are presented whereby these angles may be computed without recourse to an experimental survey of the flow field. The loads computed by the simplified procedure are compared with those computed by a more rigorous method, and the procedure is applied to several airplane configurations believed typical of current designs. The results are generally satisfactory.
The magnetic fields of hot subdwarf stars
NASA Astrophysics Data System (ADS)
Landstreet, J. D.; Bagnulo, S.; Fossati, L.; Jordan, S.; O'Toole, S. J.
2012-05-01
Context. Detection of magnetic fields has been reported in several sdO and sdB stars. Recent literature has cast doubts on the reliability of most of these detections. The situation concerning the occurrence and frequency of magnetic fields in hot subdwarfs is at best confused. Aims: We revisit data previously published in the literature, and we present new observations to clarify the question of how common magnetic fields are in subdwarf stars. Methods: We consider a sample of about 40 hot subdwarf stars. About 30 of them have been observed with the FORS1 and FORS2 instruments of the ESO VLT. Results have been published for only about half of the hot subdwarfs observed with FORS. Here we present new FORS1 field measurements for 17 stars, 14 of which have never been observed for magnetic fields before. We also critically review the measurements already published in the literature, and in particular we try to explain why previous papers based on the same FORS1 data have reported contradictory results. Results: All new and re-reduced measurements obtained with FORS1 are shown to be consistent with non-detection of magnetic fields. We explain previous spurious field detections from data obtained with FORS1 as due to a non-optimal method of wavelength calibration. Field detections in other surveys are found to be uncertain or doubtful, and certainly in need of confirmation. Conclusions: There is presently no strong evidence for the occurrence of a magnetic field in any sdB or sdO star, with typical longitudinal field uncertainties of the order of 2-400 G. It appears that globally simple fields of more than about 1 or 2 kG in strength occur in at most a few percent of hot subdwarfs. Further high-precision surveys, both with high-resolution spectropolarimeters and with instruments similar to FORS1 on large telescopes, would be very valuable. 
Based on observations collected at the European Organisation for Astronomical Research in the Southern Hemisphere, Chile under observing programmes 072.D-0290 and 075.D-0352, or obtained from the ESO/ST-ECF Science Archive Facility.
Survey of meshless and generalized finite element methods: A unified approach
NASA Astrophysics Data System (ADS)
Babuška, Ivo; Banerjee, Uday; Osborn, John E.
In the past few years meshless methods for numerically solving partial differential equations have come into the focus of interest, especially in the engineering community. This class of methods was essentially stimulated by difficulties related to mesh generation. Mesh generation is delicate in many situations: for instance, when the domain has complicated geometry; when the mesh changes with time, as in crack propagation, and remeshing is required at each time step; or when a Lagrangian formulation is employed, especially with nonlinear PDEs. In addition, the need for flexibility in the selection of approximating functions (e.g., the flexibility to use non-polynomial approximating functions) has played a significant role in the development of meshless methods. There are many recent papers, and two books, on meshless methods; most of them are of an engineering character, without any mathematical analysis. In this paper we address meshless methods and the closely related generalized finite element methods for solving linear elliptic equations, using variational principles. We give a unified mathematical theory with proofs, briefly address implementational aspects, present illustrative numerical examples, and provide a list of references to the current literature. The aim of the paper is to provide a survey of a part of this new field, with emphasis on mathematics. We present proofs of essential theorems because we feel these proofs are essential for the understanding of the mathematical aspects of meshless methods, which has approximation theory as a major ingredient. As always, any new field is stimulated by and related to older ideas. This will be visible in our paper.
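To make the variational setting concrete, here is a minimal 1D Galerkin example using standard piecewise-linear finite elements, the simplest member of the family the survey analyzes. It solves -u'' = f on (0, 1) with u(0) = u(1) = 0; this is a generic textbook sketch, not code from the survey, and f is chosen so the exact solution u = sin(πx) is known:

```python
import numpy as np

n = 100                      # number of elements
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
f = np.pi ** 2 * np.sin(np.pi * x)   # right-hand side; exact u = sin(pi x)

# Stiffness matrix for hat functions on interior nodes: tridiag(-1, 2, -1) / h.
m = n - 1
K = (np.diag(2.0 * np.ones(m))
     - np.diag(np.ones(m - 1), 1)
     - np.diag(np.ones(m - 1), -1)) / h
F = h * f[1:-1]              # lumped load vector, F_i ~ integral of f * phi_i

u = np.zeros(n + 1)          # boundary values stay zero (Dirichlet data)
u[1:-1] = np.linalg.solve(K, F)

err = np.max(np.abs(u - np.sin(np.pi * x)))
print(f"max nodal error: {err:.2e}")
```

A meshless or generalized FEM variant replaces the hat functions with other approximating functions (e.g. non-polynomial ones) while keeping the same Galerkin weak form; that freedom is exactly what the survey's unified theory addresses.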
A new method to measure galaxy bias by combining the density and weak lensing fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pujol, Arnau; Chang, Chihway; Gaztañaga, Enrique
We present a new method to measure redshift-dependent galaxy bias by combining information from the galaxy density field and the weak lensing field. This method is based on the work of Amara et al., who use the galaxy density field to construct a bias-weighted convergence field κg. The main difference between the work of Amara et al. and our new implementation is that here we present another way to measure galaxy bias, using tomography instead of bias parametrizations. The correlation between κg and the true lensing field κ allows us to measure galaxy bias using different zero-lag correlations between the two fields. Our method measures the linear bias factor on linear scales, under the assumption of no stochasticity between galaxies and matter. We use the Marenostrum Institut de Ciències de l'Espai (MICE) simulation to measure the linear galaxy bias for a flux-limited sample (i < 22.5) in tomographic redshift bins using this method. This article is the first to study the accuracy and systematic uncertainties associated with the implementation of the method, and the regime in which it is consistent with the linear galaxy bias defined by projected two-point correlation functions (2PCF). We find that our method is consistent with a linear bias at the per cent level for scales larger than 30 arcmin, while non-linearities appear at smaller scales. This measurement is a good complement to other measurements of bias, since it does not depend strongly on σ8 as the 2PCF measurements do. We will apply this method to the Dark Energy Survey Science Verification data in a follow-up article.
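A toy version of a zero-lag estimator shows why it recovers the linear bias when galaxies trace matter deterministically. The fields and noise level below are synthetic stand-ins, not the MICE measurement:

```python
import numpy as np

rng = np.random.default_rng(4)
npix = 100_000

# On linear scales with no stochasticity, the galaxy overdensity traces matter
# as delta_g = b * delta, so a bias-weighted convergence built from the galaxy
# field satisfies kappa_g ~ b * kappa, and b follows from zero-lag moments.
b_true = 1.5
kappa = rng.normal(0.0, 1.0, npix)                       # true lensing field (toy)
kappa_g = b_true * kappa + 0.3 * rng.normal(size=npix)   # plus galaxy shot noise

# Noise is uncorrelated with kappa, so <kappa_g kappa> = b <kappa kappa>.
b_est = np.mean(kappa_g * kappa) / np.mean(kappa * kappa)
print(f"estimated bias: {b_est:.2f}")
```

Because the shot noise drops out of the cross-moment, the estimator is unbiased even for noisy galaxy maps; stochasticity between galaxies and matter, excluded here by construction, is what would break this.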
Surveying Europe's Only Cave-Dwelling Chordate Species (Proteus anguinus) Using Environmental DNA.
Vörös, Judit; Márton, Orsolya; Schmidt, Benedikt R; Gál, Júlia Tünde; Jelić, Dušan
2017-01-01
In surveillance of subterranean fauna, especially in the case of rare or elusive aquatic species, traditional techniques used for epigean species are often not feasible. We developed a non-invasive survey method based on environmental DNA (eDNA) to detect the presence of the red-listed cave-dwelling amphibian, Proteus anguinus, in the caves of the Dinaric Karst. We tested the method in fifteen caves in Croatia, from which the species was previously recorded or expected to occur. We successfully confirmed the presence of P. anguinus from ten caves and detected the species for the first time in five others. Using a hierarchical occupancy model we compared the availability and detection probability of eDNA of two water sampling methods, filtration and precipitation. The statistical analysis showed that both availability and detection probability depended on the method and estimates for both probabilities were higher using filter samples than for precipitation samples. Combining reliable field and laboratory methods with robust statistical modeling will give the best estimates of species occurrence.
Current State of Agile User-Centered Design: A Survey
NASA Astrophysics Data System (ADS)
Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas
Agile software development methods are quite popular nowadays and are being adopted at an increasing rate in industry every year. However, these methods still lack usability awareness in their development lifecycle, and the integration of usability/User-Centered Design (UCD) into agile methods is not adequately addressed. This paper presents the preliminary results of a recently conducted online survey regarding the current state of the integration of agile methods and usability/UCD. Responses were received from 92 practitioners worldwide. The results show that the majority of practitioners perceive that the integration of agile methods with usability/UCD has added value to their adopted processes and to their teams; has resulted in improved usability and quality of the product developed; and has increased the satisfaction of the product's end-users. The most frequently used HCI techniques are low-fidelity prototyping, conceptual designs, observational studies of users, usability expert evaluations, field studies, personas, rapid iterative testing, and laboratory usability testing.
The Vimos VLT deep survey. Global properties of 20,000 galaxies in the IAB < 22.5 WIDE survey
NASA Astrophysics Data System (ADS)
Garilli, B.; Le Fèvre, O.; Guzzo, L.; Maccagni, D.; Le Brun, V.; de la Torre, S.; Meneux, B.; Tresse, L.; Franzetti, P.; Zamorani, G.; Zanichelli, A.; Gregorini, L.; Vergani, D.; Bottini, D.; Scaramella, R.; Scodeggio, M.; Vettolani, G.; Adami, C.; Arnouts, S.; Bardelli, S.; Bolzonella, M.; Cappi, A.; Charlot, S.; Ciliegi, P.; Contini, T.; Foucaud, S.; Gavignaud, I.; Ilbert, O.; Iovino, A.; Lamareille, F.; McCracken, H. J.; Marano, B.; Marinoni, C.; Mazure, A.; Merighi, R.; Paltani, S.; Pellò, R.; Pollo, A.; Pozzetti, L.; Radovich, M.; Zucca, E.; Blaizot, J.; Bongiorno, A.; Cucciati, O.; Mellier, Y.; Moreau, C.; Paioro, L.
2008-08-01
The VVDS-Wide survey has been designed to trace the large-scale distribution of galaxies at z ~ 1 on comoving scales reaching ~100~h-1 Mpc, while providing a good control of cosmic variance over areas as large as a few square degrees. This is achieved by measuring redshifts with VIMOS at the ESO VLT to a limiting magnitude IAB = 22.5, targeting four independent fields with sizes of up to 4 deg2 each. We discuss the survey strategy which covers 8.6 deg2 and present the general properties of the current redshift sample. This includes 32 734 spectra in the four regions, covering a total area of 6.1 deg2 with a sampling rate of 22 to 24%. This paper accompanies the public release of the first 18 143 redshifts of the VVDS-Wide survey from the 4 deg2 contiguous area of the F22 field at RA = 22^h. We have devised and tested an objective method to assess the quality of each spectrum, providing a compact figure-of-merit. This is particularly effective in the case of long-lasting spectroscopic surveys with varying observing conditions. Our figure of merit is a measure of the robustness of the redshift measurement and, most importantly, can be used to select galaxies with uniform high-quality spectra to carry out reliable measurements of spectral features. We also use the data available over the four independent regions to directly measure the variance in galaxy counts. We compare it with general predictions from the observed galaxy two-point correlation function at different redshifts and with that measured in mock galaxy surveys built from the Millennium simulation. The purely magnitude-limited VVDS Wide sample includes 19 977 galaxies, 304 type I AGNs, and 9913 stars. The redshift success rate is above 90% independent of magnitude. A cone diagram of the galaxy spatial distribution provides us with the current largest overview of large-scale structure up to z ~ 1, showing a rich texture of over- and under-dense regions. 
We give the mean N(z) distribution averaged over 6.1 deg2 for a sample limited in magnitude to IAB = 22.5. Comparing galaxy densities from the four fields shows that in a redshift bin Δz = 0.1 at z ~ 1 one still has factor-of-two variations over areas as large as ~ 0.25 deg2. This level of cosmic variance agrees with that obtained by integrating the galaxy two-point correlation function estimated from the F22 field alone. It is also in fairly good statistical agreement with that predicted by the Millennium simulations. The VVDS WIDE survey currently provides the largest area coverage among redshift surveys reaching z ~ 1. The variance estimated over the survey fields shows explicitly how clustering results from deep surveys of even 1 deg2 size should be interpreted with caution. The survey data represent a rich data base to select complete sub-samples of high-quality spectra and to study galaxy ensemble properties and galaxy clustering over unprecedented scales at these redshifts. The redshift catalog of the 4 deg2 F22 field is publicly available at http://cencosw.oamp.fr.
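The field-to-field variations in galaxy counts described above can be expressed as a relative cosmic variance using the standard shot-noise-corrected estimator, sigma_v^2 = (<N^2> - <N>^2 - <N>) / <N>^2. A minimal sketch with invented counts (not the VVDS values):

```python
import numpy as np

# Illustrative galaxy counts in four independent fields (made-up numbers)
counts = np.array([1210, 980, 1450, 760])

mean = counts.mean()
# Subtract the Poisson (shot-noise) term <N> from the count variance,
# then normalize by <N>^2 to get the relative cosmic variance
sigma_v2 = (counts.var(ddof=0) - mean) / mean**2
```

Comparing this estimate with the variance integrated from the two-point correlation function is the consistency check the abstract describes.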
A comparison of two methods for surveying mortality of beached birds in British Columbia.
Stephen, C; Burger, A E
1994-01-01
Systematic surveys of beached birds are often limited in their ability to classify the causes of death of the carcasses recovered. Two methods of determining the cause of death of seabirds encountered during surveys of beaches of southwestern Vancouver Island, British Columbia, are compared. Birds were either subjected to external visual examinations by volunteer beach surveyors or submitted for gross postmortem examination by a veterinarian. The reliance on external examination of birds on beaches often prevented the accurate classification of the reproductive status and cause of death of the birds collected, but was valuable for describing the species, locations, and numbers of birds affected. The use of gross postmortem examinations of carcasses allowed for a more refined classification of the cause of death, as well as providing reliable descriptions of the bodily condition and sex of the birds examined. However, almost one half of the carcasses encountered were unsuitable for necropsy because of scavenging and decomposition. It is concluded that a combination of field and necropsy observations provides a useful method with which to monitor the pattern of mortality of beached seabirds. PMID:7994705
A Search for Nontoroidal Topological Lensing in the Sloan Digital Sky Survey Quasar Catalog
NASA Astrophysics Data System (ADS)
Fujii, Hirokazu; Yoshii, Yuzuru
2013-08-01
Flat space models with multiply connected topology, which have compact dimensions, are tested against the distribution of high-redshift (z >= 4) quasars of the Sloan Digital Sky Survey (SDSS). When the compact dimensions are smaller in size than the observed universe, topological lensing occurs, in which multiple images of single objects (ghost images) are observed. We improve on the recently introduced method to identify ghost images by means of four-point statistics. Our method is valid for any of the 17 multiply connected flat models, including nontoroidal ones that are compactified by screw motions or glide reflections. Applying the method to the data revealed one possible case of topological lensing caused by a sixth-turn screw motion; however, the data are consistent with the simply connected model by this test alone. Moreover, simulations suggest that we cannot exclude the other space models despite the absence of their signatures. This uncertainty mainly originates from the patchy coverage of SDSS in the south Galactic cap, and this situation will be improved by future wide-field spectroscopic surveys.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang Yanxia; Ma He; Peng Nanbo
We apply one of the lazy learning methods, the k-nearest neighbor (kNN) algorithm, to estimate the photometric redshifts of quasars based on various data sets from the Sloan Digital Sky Survey (SDSS), the UKIRT Infrared Deep Sky Survey (UKIDSS), and the Wide-field Infrared Survey Explorer (WISE): the SDSS sample, the SDSS-UKIDSS sample, the SDSS-WISE sample, and the SDSS-UKIDSS-WISE sample. The influence of the k value and of different input patterns on the performance of kNN is discussed; the optimal k depends on the input pattern and the data set. The best result is obtained with the SDSS-UKIDSS-WISE sample. The experimental results generally show that the more bands contribute information, the better the performance of photometric redshift estimation with kNN. The results also demonstrate that kNN using multiband data can effectively mitigate the catastrophic failures of photometric redshift estimation encountered by many machine learning methods. Compared with various other methods of estimating the photometric redshifts of quasars, kNN based on a KD-tree shows superiority, exhibiting the best accuracy.
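A kNN photometric-redshift estimate of the kind described reduces to predicting each quasar's redshift as the mean spectroscopic redshift of its k nearest neighbours in colour space. A minimal brute-force sketch (the paper's KD-tree acceleration and real samples are not reproduced; the colour-redshift relation below is invented for illustration):

```python
import numpy as np

def knn_photoz(train_colors, train_z, query_colors, k=5):
    """Plain kNN regression: mean redshift of the k nearest
    training objects in colour space (Euclidean metric)."""
    d = np.linalg.norm(train_colors[:, None, :] - query_colors[None, :, :], axis=2)
    idx = np.argsort(d, axis=0)[:k]   # k nearest training objects per query
    return train_z[idx].mean(axis=0)

# Toy training set: 4 colours (e.g. u-g, g-r, r-i, i-z) with a fake
# linear colour-redshift relation, for illustration only
rng = np.random.default_rng(1)
colors = rng.random((200, 4))
z = colors @ np.array([0.5, 1.0, 0.3, 0.2])
pred = knn_photoz(colors, z, colors[:5], k=5)
```

Adding more bands enlarges the colour space, which is the mechanism behind the improvement the abstract reports.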
Reconstructing the gravitational field of the local Universe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Desmond, Harry; Ferreira, Pedro G.; Lavaux, Guilhem
Tests of gravity at the galaxy scale are in their infancy. As a first step to systematically uncovering the gravitational significance of galaxies, we map three fundamental gravitational variables – the Newtonian potential, acceleration and curvature – over the galaxy environments of the local Universe to a distance of approximately 200 Mpc. Our method combines the contributions from galaxies in an all-sky redshift survey, haloes from an N-body simulation hosting low-luminosity objects, and linear and quasi-linear modes of the density field. We use the ranges of these variables to determine the extent to which galaxies expand the scope of generic tests of gravity and are capable of constraining specific classes of model for which they have special significance. In conclusion, we investigate the improvements afforded by upcoming galaxy surveys.
Beyond δ: Tailoring marked statistics to reveal modified gravity
NASA Astrophysics Data System (ADS)
Valogiannis, Georgios; Bean, Rachel
2018-01-01
Models that attempt to explain the accelerated expansion of the universe through large-scale modifications to General Relativity (GR) must satisfy the stringent experimental constraints on GR in the solar system. Viable candidates invoke a "screening" mechanism that dynamically suppresses deviations in high-density environments, making their overall detection challenging even for ambitious future large-scale structure surveys. We present methods to efficiently simulate the non-linear properties of such theories, and consider how a series of statistics that reweight the density field to accentuate deviations from GR can be applied to enhance the overall signal-to-noise ratio in differentiating the models from GR. Our results demonstrate that the cosmic density field can yield additional, invaluable cosmological information, beyond the simple density power spectrum, that will enable surveys to more confidently discriminate between modified gravity models and ΛCDM.
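One common choice for the density reweighting described above, drawn from the marked-statistics literature (the functional form of White 2016; the parameter values here are illustrative, not the paper's), up-weights low-density regions, where screening is weakest:

```python
import numpy as np

def mark(delta, delta_s=0.6, p=1.0):
    """Up-weight underdense regions, where screened fifth forces are
    least suppressed (mark functional form from White 2016)."""
    return ((1.0 + delta_s) / (1.0 + delta_s + delta)) ** p

# Toy overdensity field; a marked density field weights (1 + delta)
# by the mark before computing power spectra or correlations
rng = np.random.default_rng(2)
delta = rng.lognormal(mean=0.0, sigma=0.5, size=100_000) - 1.0
marked = mark(delta) * (1.0 + delta)
```

Because the mark decreases monotonically with overdensity, statistics of the marked field emphasize voids and low-density filaments, where modified-gravity deviations survive screening.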
Seismoelectric field measurements in unconsolidated sediments
NASA Astrophysics Data System (ADS)
Rabbel, Wolfgang; Iwanowski-Strahser, Katja; Strahser, Matthias; Dzieran, Laura; Thorwart, Martin
2017-04-01
Seismoelectric (SE) prospecting has the potential of determining hydraulic permeability in situ. However, the SE response of geological interfaces (IR) is also influenced by porosity, saturation and salinity. We present examples of SE surveys of near-surface unconsolidated sediments showing clear IR arrivals from the shallow groundwater table and laterally consistent IR arrivals from interfaces inside the vadose zone. These measurements are complemented by seismic, GPR and geoelectric surveys for constraining bulk porosity, water saturation and salinity. They show that porosity and water content change at the interfaces generating IR arrivals. The combination of these methods enables us to estimate the permeability contrast associated with major IR arrivals via numerical modeling of SE waveform amplitudes. In the case of the analyzed field example, this contrast is estimated to be of the order of 10 within the vadose zone and of 100 at the aquifer-aquitard interface.
Closed-loop and robust control of quantum systems.
Chen, Chunlin; Wang, Lin-Cheng; Wang, Yuanlong
2013-01-01
For most practical quantum control systems, it is important and difficult to attain robustness and reliability due to unavoidable uncertainties in the system dynamics or models. Three kinds of typical approaches (e.g., closed-loop learning control, feedback control, and robust control) have been proved to be effective to solve these problems. This work presents a self-contained survey on the closed-loop and robust control of quantum systems, as well as a brief introduction to a selection of basic theories and methods in this research area, to provide interested readers with a general idea for further studies. In the area of closed-loop learning control of quantum systems, we survey and introduce such learning control methods as gradient-based methods, genetic algorithms (GA), and reinforcement learning (RL) methods from a unified point of view of exploring the quantum control landscapes. For the feedback control approach, the paper surveys three control strategies including Lyapunov control, measurement-based control, and coherent-feedback control. Then such topics in the field of quantum robust control as H(∞) control, sliding mode control, quantum risk-sensitive control, and quantum ensemble control are reviewed. The paper concludes with a perspective of future research directions that are likely to attract more attention.
Web-based flood database for Colorado, water years 1867 through 2011
Kohn, Michael S.; Jarrett, Robert D.; Krammes, Gary S.; Mommandi, Amanullah
2013-01-01
In order to provide a centralized repository of flood information for the State of Colorado, the U.S. Geological Survey, in cooperation with the Colorado Department of Transportation, created a Web-based geodatabase for flood information from water years 1867 through 2011 and data for paleofloods occurring in the past 5,000 to 10,000 years. The geodatabase was created using the Environmental Systems Research Institute ArcGIS JavaScript Application Programming Interface 3.2. The database can be accessed at http://cwscpublic2.cr.usgs.gov/projects/coflood/COFloodMap.html. Data on 6,767 flood events at 1,597 individual sites throughout Colorado were compiled to generate the flood database. The data sources of flood information are indirect discharge measurements that were stored in U.S. Geological Survey offices (water years 1867–2011), flood data from indirect discharge measurements referenced in U.S. Geological Survey reports (water years 1884–2011), paleoflood studies from six peer-reviewed journal articles (data on events occurring in the past 5,000 to 10,000 years), and the U.S. Geological Survey National Water Information System peak-discharge database (water years 1883–2010). A number of tests were performed on the flood database to ensure the quality of the data. The Web interface was programmed using the Environmental Systems Research Institute ArcGIS JavaScript Application Programming Interface 3.2, which allows for display, query, georeference, and export of the data in the flood database. The data fields in the flood database used to search and filter the database include hydrologic unit code, U.S. Geological Survey station number, site name, county, drainage area, elevation, data source, date of flood, peak discharge, and field method used to determine discharge. Additional data fields can be viewed and exported, but the data fields described above are the only ones that can be used for queries.
Bathymetric survey of water reservoirs in north-eastern Brazil based on TanDEM-X satellite data.
Zhang, Shuping; Foerster, Saskia; Medeiros, Pedro; de Araújo, José Carlos; Motagh, Mahdi; Waske, Bjoern
2016-11-15
Water scarcity in the dry season is a vital problem in dryland regions such as northeastern Brazil. Water supplies in these areas often come from numerous reservoirs of various sizes. However, inventory data for these reservoirs is often limited due to the expense and time required for their acquisition via field surveys, particularly in remote areas. Remote sensing techniques provide a valuable alternative to conventional reservoir bathymetric surveys for water resource management. In this study single-pass TanDEM-X data acquired in bistatic mode were used to generate digital elevation models (DEMs) in the Madalena catchment, northeastern Brazil. Validation with differential global positioning system (DGPS) data from field measurements indicated an absolute elevation accuracy of approximately 1 m for the TanDEM-X derived DEMs (TDX DEMs). The DEMs derived from TanDEM-X data acquired at low water levels show significant advantages over bathymetric maps derived from field survey, particularly with regard to coverage, evenly distributed measurements and replication of reservoir shape. Furthermore, by mapping the dry reservoir bottoms with TanDEM-X data, TDX DEMs are free of emergent and submerged macrophytes, independent of water depth (e.g., >10 m), water quality and even weather conditions. Thus, the method is superior to other existing bathymetric mapping approaches, particularly for inland water bodies. The proposed approach relies on (nearly) dry reservoir conditions at times of image acquisition and is thus restricted to areas that show considerable water-level variations. However, comparisons between the TDX DEM and the bathymetric map derived from field surveys show that the amount of water retained during the dry phase has only a marginal impact on the total water volume derived from the TDX DEM. 
Overall, DEMs generated from bistatic TanDEM-X data acquired in low water periods constitute a useful and efficient data source for deriving reservoir bathymetry and show great potential in large scale application. Copyright © 2016 Elsevier B.V. All rights reserved.
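Once a DEM of the (nearly) dry reservoir bed is available, storage at any water level follows by integrating water depth over the grid cells below that level. A minimal sketch on a toy 2x2 grid (all numbers invented):

```python
import numpy as np

# Toy bed-elevation DEM (metres above datum) on 10 m x 10 m cells
bed = np.array([[102.0, 101.0],
                [100.5,  99.0]])
level = 101.5            # hypothetical water-surface elevation, m
cell_area = 10.0 * 10.0  # m^2 per DEM cell

# Depth is water level minus bed elevation, clipped at zero for dry cells;
# summing depth * cell area gives stored volume in m^3
volume = np.clip(level - bed, 0.0, None).sum() * cell_area
```

Repeating this for a range of levels yields the stage-storage curve that a conventional bathymetric survey would otherwise provide.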
An Exploration Geophysics Course With an Environmental Focus for an Urban Minority Institution
NASA Astrophysics Data System (ADS)
Kenyon, P. M.
2004-12-01
A hands-on exploration geophysics field course with an environmental focus has been developed with NSF support for use at the City College of New York in Manhattan. To maximize access for the students, no prerequisites beyond introductory earth science and physics are required. The course is taught for three hours on Saturday mornings. This has resulted in it attracting not only regular City College students, but also earth science teachers studying for alternate certification or Master's degrees. After a brief introduction to the nature of geophysics and to concepts in data processing, the course is taught in four three-week modules, one each on seismology, resistivity surveying, electromagnetic ground conductivity, and magnetic measurements. Each module contains one week of theory, a field experience, computer data analysis, and a final report. Field exercises are planned to emphasize teamwork and include realistic urban applications of the techniques. Student surveys done in conjunction with this course provide insights into the motivations and needs of the mostly minority students taking it. In general, these students come to the course already comfortable with teamwork and with working in the field. The questionnaires indicate that their greatest need is increased knowledge of the methods of geophysics and of the problems that can be attacked using it. Most of the students gave high ratings to the course, citing the fieldwork as the part that they most enjoyed. The results of these surveys will be presented, along with examples of the field exercises used. The computer analysis assignments written for this course will also be available.
Ellefsen, K.J.; Burton, B.L.; Lucius, J.E.; Haines, S.S.; Fitterman, D.V.; Witty, J.A.; Carlson, D.; Milburn, B.; Langer, W.H.
2007-01-01
Personnel from the U.S. Geological Survey and Martin Marietta Aggregates, Inc., conducted field demonstrations of five different geophysical methods to show how these methods could be used to characterize deposits of alluvial aggregate. The methods were time-domain electromagnetic sounding, electrical resistivity profiling, S-wave reflection profiling, S-wave refraction profiling, and P-wave refraction profiling. All demonstrations were conducted at one site within a river valley in central Indiana, where the stratigraphy consisted of 1 to 2 meters of clay-rich soil, 20 to 35 meters of alluvial sand and gravel, 1 to 6 meters of clay, and multiple layers of limestone and dolomite bedrock. All geophysical methods, except time-domain electromagnetic sounding, provided information about the alluvial aggregate that was consistent with the known geology. Although time-domain electromagnetic sounding did not work well at this site, it has worked well at other sites with different geology. All of these geophysical methods complement traditional methods of geologic characterization such as drilling.
Benchmark study on glyphosate-resistant crop systems in the United States. Part 2: Perspectives.
Owen, Micheal D K; Young, Bryan G; Shaw, David R; Wilson, Robert G; Jordan, David L; Dixon, Philip M; Weller, Stephen C
2011-07-01
A six-state, 5 year field project was initiated in 2006 to study weed management methods that foster the sustainability of genetically engineered (GE) glyphosate-resistant (GR) crop systems. The benchmark study field-scale experiments were initiated following a survey, conducted in the winter of 2005-2006, of farmer opinions on weed management practices and their views on GR weeds and management tactics. The main survey findings supported the premise that growers were generally less aware of the significance of evolved herbicide resistance and did not have a high recognition of the strong selection pressure from herbicides on the evolution of herbicide-resistant (HR) weeds. The results of the benchmark study survey indicated that there are educational challenges to implement sustainable GR-based crop systems and helped guide the development of the field-scale benchmark study. Paramount is the need to develop consistent and clearly articulated science-based management recommendations that enable farmers to reduce the potential for HR weeds. This paper provides background perspectives about the use of GR crops, the impact of these crops and an overview of different opinions about the use of GR crops on agriculture and society, as well as defining how the benchmark study will address these issues. Copyright © 2011 Society of Chemical Industry.
Black carbon emissions from biomass and coal in rural China
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Weishi; Lu, Zifeng; Xu, Yuan
Residential solid fuel combustion makes a major contribution to black carbon (BC) emissions in China. A new estimation of BC emissions from rural solid biomass and coal consumption has been derived from field survey data. The following new contributions are made: (1) emission factors are collected and reviewed; (2) household energy data are collected from field survey data and from the literature; (3) a new extrapolation method is developed to extend the field survey data to other locations; (4) the ownership and usage of two stove types are estimated and considered in the emission calculations; and (5) uncertainties associated with the estimation results are quantified. It is shown that rural households with higher income will consume less biomass but more coal. Agricultural acreage and temperature also significantly influence the amount of solid fuel consumed in rural areas. It is estimated that 640±245 Gg BC/y were emitted to the atmosphere due to residential solid fuel consumption in rural China in 2014. Emissions of BC from straw, wood, and coal contributed 42±13%, 36±15%, and 22±10% of the total, respectively. We show that effective BC mitigation (a reduction of 47%) could be obtained through widespread introduction of improved stoves in rural households.
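The estimate described is a bottom-up calculation: fuel consumption (activity) multiplied by a per-fuel emission factor and summed over fuel types. A minimal sketch with invented activity data and emission factors (not the paper's inputs):

```python
# Hypothetical rural fuel consumption, in Gg of fuel per year
fuel_use_gg = {"straw": 120_000, "wood": 90_000, "coal": 60_000}

# Hypothetical BC emission factors, in g BC per kg of fuel burned
ef_g_per_kg = {"straw": 1.2, "wood": 1.1, "coal": 0.8}

# Gg fuel x (g BC / kg fuel) / 1000 = Gg BC, summed over fuels
bc_gg = sum(fuel_use_gg[f] * ef_g_per_kg[f] / 1000 for f in fuel_use_gg)
```

The survey-based extrapolation and stove-usage corrections in the abstract refine the activity term; the uncertainty bounds come from propagating uncertainty in both factors.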
Grech, James; Robertson, James; Thomas, Jackson; Cooper, Gabrielle; Naunton, Mark; Kelly, Tamsin
2018-01-05
For decades, thousands of people have been dying from malaria infections because of poor-quality medicines (PQMs). While numerous efforts have been initiated to reduce their presence, PQMs are still risking the lives of those seeking treatment. This review addresses the importance of characterising results of antimalarial medicine field surveys based upon the agreement of clearly defined definitions. Medicines found to be of poor quality can be falsified or counterfeit, substandard or degraded. The distinction between these categories is important as each category requires a different countermeasure. To observe the current trends in the reporting of field surveys, a systematic literature search of six academic databases resulted in the quantitative analysis of 61 full-text journal articles. Information including sample size, sampling method, geographical regions, analytical techniques, and characterisation conclusions was observed for each. The lack of an accepted uniform reporting system has resulted in varying, incomplete reports, which may not include important information that helps form effective countermeasures. The programmes influencing medicine quality such as prequalification, procurement services, awareness and education can be supported with the information derived from characterised results. The implementation of checklists such as the Medicine Quality Assessment Reporting Guidelines will further strengthen the battle against poor-quality antimalarials. Copyright © 2017 Elsevier B.V. All rights reserved.
Response of six neutron survey meters in mixed fields of fast and thermal neutrons.
Kim, S I; Kim, B H; Chang, I; Lee, J I; Kim, J L; Pradhan, A S
2013-10-01
Calibration neutron fields have been developed at KAERI (Korea Atomic Energy Research Institute) to study the responses of commonly used neutron survey meters in the presence of fast neutrons of energy around 10 MeV. The neutron fields were produced by using neutrons from the (241)Am-Be sources held in a graphite pile and a DT neutron generator. The spectral details and the ambient dose equivalent rates of the calibration fields were established, and the responses of six neutron survey meters were evaluated. Four single-moderator-based survey meters exhibited under-responses ranging from ∼9 to 55 %. DINEUTRUN, commonly used in fields around nuclear reactors, exhibited an over-response by a factor of three in the thermal neutron field and an under-response of ∼85 % in the mixed fields. REM-500 (a tissue-equivalent proportional counter) exhibited a response close to 1.0 in the fast neutron fields and an under-response of ∼50 % in the thermal neutron field.
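The quoted under- and over-responses are ratios of the meter reading to the reference ambient dose equivalent rate H*(10) of the calibration field. A trivial sketch with invented dose rates:

```python
def response(indicated_uSv_h, reference_uSv_h):
    """Survey-meter response: indicated dose-equivalent rate divided
    by the reference H*(10) rate of the calibration field."""
    return indicated_uSv_h / reference_uSv_h

# Hypothetical reading of 45 uSv/h in a 100 uSv/h reference field:
# response 0.45, i.e. an under-response of 55 %
r = response(45.0, 100.0)
```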
Techniques employed for detection of hot particles in the marine environment.
Pillsbury, G D
2007-09-01
During the decommissioning of the Maine Yankee nuclear plant, several methods were developed and employed to survey for hot particles in the marine environment surrounding the site. The methods used and the sensitivities achieved in the search for environmentally dispersed particles during the various decommissioning activities performed are described in detail. Surveys were performed on dry soil, exposed marine sediment and submerged marine sediment. Survey techniques ranged from the use of the basic NaI detector coupled to a count rate meter to an intrinsic germanium detector deployed in a submarine housing coupled to a multi-channel analyser. The initial surveys consisted of collecting samples of marine sediment, spreading them out over a 1 m2 surface in a thin layer, and scanning the deposited sediment by hand using a 5 cm by 5 cm NaI detector coupled to a standard count rate meter. This technique was later replaced by walkover scans with the 5 cm by 5 cm NaI detector moved in a serpentine pattern over the sediment surface. By coupling the detector to a 'smart meter', an alarm set point could be used to alert the surveyor to the presence of a particle within the instrument's field of view. A similar technique, with the detector mounted in a watertight housing secured to the end of a pole, was also employed to scan underwater locations. The most sensitive method developed for performing underwater surveys was the use of the intrinsic germanium detector placed in a submarine housing. Detailed descriptions of the methods employed and the results obtained are presented. This work demonstrates that there are several approaches to surveying for discrete particles in the marine environment and the relative merits of each are considered.
Fluvial sediment fingerprinting: literature review and annotated bibliography
Williamson, Joyce E.; Haj, Adel E.; Stamm, John F.; Valder, Joshua F.; Prautzch, Vicki L.
2014-01-01
The U.S. Geological Survey has evaluated and adopted various field methods for collecting real-time sediment and nutrient data. These methods have proven to be valuable representations of sediment and nutrient concentrations and loads but are not able to accurately identify specific source areas. Recently, more advanced data collection and analysis techniques have been evaluated that show promise in identifying specific source areas. Application of field methods could include studies of sources of fluvial sediment, otherwise referred to as sediment "fingerprinting." The identification of sediment sources is important, in part, because knowing the primary sediment source areas in watersheds ensures that best management practices are incorporated in areas that maximize reductions in sediment loadings. This report provides a literature review and annotated bibliography of existing methodologies applied in the field of fluvial sediment fingerprinting. This literature review provides a bibliography of publications where sediment fingerprinting methods have been used; however, this report does not claim to provide an exhaustive listing. Selected publications were categorized by methodology with some additional summary information. The information contained in the summary may help researchers select methods better suited to their particular study or study area, and identify methods in need of more testing and application.
NASA Astrophysics Data System (ADS)
Benni, P.
2017-06-01
(Abstract only) GPX is designed to search high-density star fields that other surveys, such as WASP, HATNet, XO, and KELT, would find challenging due to blending of transit-like events. Using readily available amateur equipment, a survey telescope (Celestron RASA, 279 mm f/2.2, based in Acton, Massachusetts) was configured first with an SBIG ST-8300M camera and later upgraded to an FLI ML16200 camera, and was tested under different sampling scenarios with multiple image fields to obtain a 9- to 11-minute cadence per field. The resultant image resolution of GPX is about 2 arcsec/pixel, compared with 13.7 to 23 arcsec/pixel for the aforementioned surveys and the future TESS space telescope exoplanet survey.
Application of geoelectric methods for man-caused gas deposit mapping and monitoring
NASA Astrophysics Data System (ADS)
Yakymchuk, M. A.; Levashov, S. P.; Korchagin, I. N.; Syniuk, B. B.
2009-04-01
The relatively new application of the original geoelectric methods of forming a short-pulsed electromagnetic field (FSPEF) and vertical electric-resonance sounding (VERS), together the FSPEF-VERS technology (Levashov et al., 2003; 2004), is discussed. In 2008 the FSPEF-VERS methods were used to ascertain the causes of a serious man-caused accident at a gas field, where an emission of water and gas occurred near an operational well. The working assumption was that part of the gas from the producing horizons had escaped into the upper horizons, into aquifer layers, creating excess pressure in the aquifers that led to the accident. In the first stage, rapid geophysical investigations of the accident site were carried out with the FSPEF and VERS geoelectric methods on 07.10.08 and 13.10.08. The primary goal of this work was to detect and map the gas-penetration zones in the aquifers of the upper part of the cross-section, and to determine the bedding depths and the total area over which gas was distributed in the upper aquifers. An anomalous zone, caused by increased water migration into the upper horizons of the cross-section, was revealed and mapped by the FSPEF survey. The depths of anomalous polarized layers (APL) of "gas" and "aquifer" type were determined by the VERS method. The VERS data are presented as sounding diagrams and columns, as vertical cross-sections along and across the gas-penetration zones, and as a map of the thickness of the man-caused gas "deposit". On the basis of the first day's data, a producing borehole was perforated at depths of 450 and 310 m, and gas discharges were obtained from both depths. Three degassing boreholes, each about 340 m deep, had been drilled by 08.11.08, with gas inflows obtained from a depth of 330 m, and drilling of a fourth well was under way. By that date the area of the anomalous zone had decreased by half in comparison with the two previous surveys.
In total, the anomaly area was S = 20.7 hectares on 07.10.08, S = 19.7 hectares on 13.10.08, and S = 10.5 hectares on 08.11.08. The anomaly intensity decreased and some local extrema appeared; all of this indicates an intensive degassing of the upper part of the cross-section through the producing wells and the drilled degassing wells. An exceptionally important feature of the FSPEF-VERS technology is the speed with which it solves practical problems, and for an emergency at a gas field this speed is of crucial importance. In a single day of work, the field management received a considerable volume of operative information, allowing it to make well-founded estimates of the scale of the accident, its possible causes, and the threats it could pose to nearby settlements. Overlaying a sketch map of the distribution of the man-caused gas deposit on a map of well locations showed that the deposit does not extend beyond the field border and hence poses no essential threat to nearby settlements. The responsiveness of the technology as a whole, and the practical experience of repeated measurements, testify to the possibility of using the FSPEF-VERS methods for rapid monitoring surveys. Such surveys can be carried out on a field after degassing wells have been drilled, to check the progress of gas drawdown from a man-caused deposit. The geoelectric investigations of the emergency site on 08.11.08, and the results obtained, demonstrate in practice the efficiency and capability of the FSPEF-VERS technology in monitoring mode. The experimental work showed that the drawdown of gas from a man-caused deposit can be traced over time with the FSPEF-VERS technology, and that additional degassing wells for the definitive elimination of the accident's consequences should be sited taking into account the data from FSPEF-VERS monitoring.
The experimental results testify to the practical possibility of using these methods for the rapid solution of specific problems of oil and gas extraction, and they are one more weighty argument for the broader use of FSPEF-VERS technologies in geological prospecting for oil and gas. Levashov S.P., Yakymchuk N.A., Korchagin I.N., Taskynbaev K.M. (2003) Geoelectric investigations on the Kenbye oilfield in Western Kazakhstan. 65th EAGE Conference & Exhibition, Extended Abstracts, P154. Levashov S.P., Yakymchuk M.A., Korchagin I.N., Pyschaniy Ju.M., Yakymchuk Ju.M. (2004) Electric-resonance sounding method and its application for ecological, geological-geophysical and engineering-geological investigations. 66th EAGE Conference and Technical Exhibition, Extended Abstracts, P035.
Wide-field Infrared Survey Explorer Artist Concept
2009-05-18
NASA's Wide-field Infrared Survey Explorer (WISE) mission will survey the entire sky in a portion of the electromagnetic spectrum called the mid-infrared, with far greater sensitivity than any previous mission or program.
Lanata, C F; Black, R E
1991-01-01
Traditional survey methods, which are generally costly and time-consuming, usually provide information at the regional or national level only. The utilization of lot quality assurance sampling (LQAS) methodology, developed in industry for quality control, makes it possible to use small sample sizes when conducting surveys in small geographical or population-based areas (lots). This article describes the practical use of LQAS for conducting health surveys to monitor health programmes in developing countries. Following a brief description of the method, the article explains how to build a sample frame and conduct the sampling to apply LQAS under field conditions. A detailed description of the procedure for selecting a sampling unit to monitor the health programme and a sample size is given. The sampling schemes utilizing LQAS applicable to health surveys, such as simple- and double-sampling schemes, are discussed. The interpretation of the survey results and the planning of subsequent rounds of LQAS surveys are also discussed. When describing the applicability of LQAS in health surveys in developing countries, the article considers current limitations for its use by health planners in charge of health programmes, and suggests ways to overcome these limitations through future research. It is hoped that with increasing attention being given to industrial sampling plans in general, and LQAS in particular, their utilization to monitor health programmes will provide health planners in developing countries with powerful techniques to help them achieve their health programme targets.
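As an illustration of the LQAS decision rule described above (sample n subjects from a lot and accept the lot if at most d "failures" are found), the sketch below computes the two binomial error risks of a small-sample plan. The 19/6 plan and the 80%/50% coverage thresholds are common textbook values for vaccination-coverage LQAS, not figures from this article:

```python
from math import comb

def binom_cdf(k, n, p):
    # P(X <= k) for X ~ Binomial(n, p)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_risks(n, d, p_good=0.80, p_bad=0.50):
    """For the rule 'sample n, accept the lot if at most d failures':
    alpha = risk of rejecting a lot with true coverage p_good,
    beta  = risk of accepting a lot with true coverage p_bad.
    The per-subject failure probability is 1 - coverage."""
    alpha = 1 - binom_cdf(d, n, 1 - p_good)  # provider risk
    beta = binom_cdf(d, n, 1 - p_bad)        # consumer risk
    return alpha, beta

# Classic plan: sample 19, accept if at most 6 unvaccinated found.
alpha, beta = lqas_risks(n=19, d=6)
print(f"alpha = {alpha:.3f}, beta = {beta:.3f}")  # both under 10%
```

Both risks come out below 10%, which is why such small sample sizes per lot are considered acceptable for programme monitoring.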
Automated detection of extended sources in radio maps: progress from the SCORPIO survey
NASA Astrophysics Data System (ADS)
Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.
2016-08-01
Automated source extraction and parametrization represents a crucial challenge for next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR as a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australia Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane, and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.
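CAESAR itself is a full software library, but the essence of the background-estimation step it relies on (estimate the local background and noise, then flag pixels significantly above them) can be sketched with a simple iterative sigma-clipping routine. This is an illustrative stand-in, not the actual CAESAR algorithm:

```python
import numpy as np

def sigma_clipped_stats(image, nsigma=3.0, iters=5):
    """Iteratively clip outliers to estimate background mean and rms,
    so bright sources do not bias the noise estimate."""
    data = image.ravel().astype(float)
    for _ in range(iters):
        mu, sigma = data.mean(), data.std()
        keep = np.abs(data - mu) < nsigma * sigma
        if keep.all():
            break
        data = data[keep]
    return data.mean(), data.std()

# Synthetic map: unit Gaussian noise plus one bright compact "source".
rng = np.random.default_rng(0)
image = rng.normal(0.0, 1.0, (64, 64))
image[30:34, 30:34] += 50.0

bkg, rms = sigma_clipped_stats(image)
mask = image > bkg + 5.0 * rms  # 5-sigma detection threshold
print(mask.sum())               # pixels flagged as belonging to a source
```

Without the clipping, the bright source would inflate the rms estimate and raise the detection threshold; with it, the 4×4 injected source is cleanly recovered.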
Statistical Methods and Tools for Uxo Characterization (SERDP Final Technical Report)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pulsipher, Brent A.; Gilbert, Richard O.; Wilson, John E.
2004-11-15
The Strategic Environmental Research and Development Program (SERDP) issued a statement of need for FY01 titled Statistical Sampling for Unexploded Ordnance (UXO) Site Characterization that solicited proposals to develop statistically valid sampling protocols for cost-effective, practical, and reliable investigation of sites contaminated with UXO; protocols that could be validated through subsequent field demonstrations. The SERDP goal was the development of a sampling strategy in which a fraction of the site is initially surveyed by geophysical detectors to confidently identify clean areas and subsections (target areas, TAs) that have elevated densities of anomalous geophysical detector readings that could indicate the presence of UXO. More detailed surveys could then be conducted to search the identified TAs for UXO. SERDP funded three projects: those proposed by the Pacific Northwest National Laboratory (PNNL) (SERDP Project No. UXO 1199), Sandia National Laboratory (SNL), and Oak Ridge National Laboratory (ORNL). The projects were closely coordinated to minimize duplication of effort and facilitate use of shared algorithms where feasible. This final report for PNNL Project 1199 describes the methods developed by PNNL to address SERDP's statement of need for the development of statistically based geophysical survey methods for sites where 100% surveys are unattainable or cost prohibitive.
NASA Astrophysics Data System (ADS)
Sorteberg, Hilleborg K.
2010-05-01
In the hydropower industry, it is important to have precise information about snow deposits at all times, to allow for effective planning and optimal use of the water. In Norway, it is common to measure snow density using a manual method, i.e. the depth and weight of the snow are measured. In recent years, radar measurements have been taken from snowmobiles; however, few energy supply companies use this method operationally - it has mostly been used in connection with research projects. Agder Energi is the first Norwegian power producer to use radar technology from a helicopter to monitor mountain snow levels. Measurement accuracy is crucial when obtaining input data for snow reservoir estimates. Radar screening by helicopter makes remote areas more easily accessible and provides larger quantities of data than traditional ground-level measurement methods. The snow survey system is designed on the assumption that snow distribution is influenced by vegetation, climate and topography. To take these factors into consideration, a snow survey system for fields in high mountain areas has been designed in which data collection is carried out by following the lines of a grid. The lines of this grid are placed so as to effectively capture the distribution of elevation, x-coordinates, y-coordinates, aspect, slope and curvature in the field. Variations in climatic conditions are also captured better when using a grid, and dominant weather patterns will largely be captured by this measurement system.
Radio weak lensing shear measurement in the visibility domain - II. Source extraction
NASA Astrophysics Data System (ADS)
Rivi, M.; Miller, L.
2018-05-01
This paper extends the method introduced in Rivi et al. (2016b) to measure galaxy ellipticities in the visibility domain for radio weak lensing surveys. In that paper, we focused on the development and testing of the method for the simple case of individual galaxies located at the phase centre, and proposed to extend it to the realistic case of many sources in the field of view by isolating the visibilities of each source with a faceting technique. In this second paper, we present a detailed algorithm for source extraction in the visibility domain and show its effectiveness as a function of the source number density by running simulations of SKA1-MID observations in the band 950-1150 MHz and comparing original and measured values of galaxies' ellipticities. Shear measurements from a realistic population of 10^4 galaxies randomly located in a field of view of 1 deg^2 (i.e. the source density expected for the current radio weak lensing survey proposal with SKA1) are also performed. At SNR ≥ 10, the multiplicative bias is only a factor of 1.5 worse than that found when analysing individual sources, and is still comparable to the bias values reported for similar measurement methods at optical wavelengths. The additive bias is unchanged from the case of individual sources, but it is significantly larger than is typically found in optical surveys. This bias depends on the shape of the uv coverage, and we suggest that a uv-plane weighting scheme producing a more isotropic shape could reduce and control it.
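The multiplicative and additive biases quoted above refer to the standard linear calibration model g_obs = (1 + m) g_true + c; in simulations, m and c are recovered by fitting measured shears against the input values. A minimal sketch with synthetic numbers (illustrative bias values, not the paper's data):

```python
import numpy as np

def fit_shear_bias(g_true, g_obs):
    """Fit g_obs = (1 + m) * g_true + c by least squares;
    return multiplicative bias m and additive bias c."""
    A = np.vstack([g_true, np.ones_like(g_true)]).T
    slope, c = np.linalg.lstsq(A, g_obs, rcond=None)[0]
    return slope - 1.0, c

# Synthetic "measured" shears with known biases m=0.02, c=0.001
rng = np.random.default_rng(1)
g_true = rng.uniform(-0.05, 0.05, 1000)
g_obs = (1 + 0.02) * g_true + 0.001 + rng.normal(0, 1e-4, g_true.size)

m, c = fit_shear_bias(g_true, g_obs)
print(f"m = {m:.4f}, c = {c:.5f}")  # recovers the injected biases
```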
Evaluation of Maryland abutment scour equation through selected threshold velocity methods
Benedict, S.T.
2010-01-01
The U.S. Geological Survey, in cooperation with the Maryland State Highway Administration, used field measurements of scour to evaluate the sensitivity of the Maryland abutment scour equation to the critical (or threshold) velocity variable. Four selected methods for estimating threshold velocity were applied to the Maryland abutment scour equation, and the predicted scour was compared to the field measurements. Results indicated that the performance of the Maryland abutment scour equation was sensitive to the threshold velocity, with some threshold velocity methods producing better estimates of predicted scour than others. In addition, results indicated that regional stream characteristics can affect the performance of the Maryland abutment scour equation, with moderate-gradient streams performing differently from low-gradient streams. On the basis of the findings of the investigation, guidance for selecting threshold velocity methods for application to the Maryland abutment scour equation is provided, and limitations are noted.
A color video display technique for flow field surveys
NASA Technical Reports Server (NTRS)
Winkelmann, A. E.; Tsao, C. P.
1982-01-01
A computer-driven color video display technique has been developed for the presentation of wind tunnel flow field survey data. The results of both qualitative and quantitative flow field surveys can be presented in high-spatial-resolution, color-coded displays. The technique has been used for data obtained with a hot-wire probe, a split-film probe, a Conrad (pitch) probe and a 5-tube pressure probe in surveys above and behind a wing with partially stalled and fully stalled flow.
AGN Variability in the GOODS Fields
NASA Astrophysics Data System (ADS)
Sarajedini, Vicki
2007-07-01
Variability is a proven method to identify intrinsically faint active nuclei in galaxies found in deep HST surveys. We propose to extend our short-term variability study of the GOODS fields to include the more recent epochs obtained via supernova searches, increasing the overall time baseline from 6 months to 2.5 years. Based on typical AGN lightcurves, we expect to detect 70% more AGN by including these more recent epochs. Variability-detected AGN samples complement current X-ray and mid-IR surveys for AGN by providing unambiguous evidence of nuclear activity. Additionally, a significant number of variable nuclei are not associated with X-ray or mid-IR sources and would thus go undetected. With the increased time baseline, we will be able to construct the structure function (variability amplitude versus time) for low-luminosity AGN to z ~ 1. The inclusion of the longer time interval will allow for better discrimination among the various models describing the nature of AGN variability. The variability survey will be compared against spectroscopically selected AGN from the Team Keck Redshift Survey of the GOODS-N and the upcoming Flamingos-II NIR survey of the GOODS-S. The high-resolution ACS images will be used to separate the AGN from the host galaxy light and to study the morphology, size and environment of the host galaxy. These studies will address questions concerning the nature of low-luminosity AGN evolution and variability at z ~ 1.
Electron scattering by molecules. II - Experimental methods and data
NASA Technical Reports Server (NTRS)
Trajmar, S.; Chutjian, A.; Register, D. F.
1983-01-01
Experimental techniques for measuring electron-molecule collision cross sections are briefly summarized. A survey of the available experimental cross section data is presented. The emphasis here is on elastic scattering, rotational, vibrational and electronic excitations, total electron scattering, and momentum transfer in the few-eV to few-hundred-eV impact energy range. Reference is made to works concerned with high-energy electron scattering, inner-shell and multi-electron excitations, coincidence methods, and electron scattering in laser fields.
Utilizing Urban Environments for Effective Field Experiences
NASA Astrophysics Data System (ADS)
MacAvoy, S. E.; Knee, K.
2014-12-01
Research surveys suggest that students are demanding more applied field experiences from their undergraduate environmental science programs. For geoscience educators at liberal arts colleges without field camps, university vehicles, or even geology departments, getting students into the field is especially rewarding - and especially challenging. Here, we present strategies that we have used in courses ranging from introductory environmental science for non-majors, to upper level environmental methods and geology classes. Urban locations provide an opportunity for a different type of local "field-work" than would otherwise be available. In the upper-level undergraduate Environmental Methods class, we relied on a National Park area located a 10-minute walk from campus for most field exercises. Activities included soil analysis, measuring stream flow and water quality parameters, dendrochronology, and aquatic microbe metabolism. In the non-majors class, we make use of our urban location to contrast water quality in parks and highly channelized urban streams. Here we share detailed lesson plans and budgets for field activities that can be completed during a class period of 2.5 hours with a $75 course fee, show how these activities help students gain quantitative competency, and provide student feedback about the classes and activities.
NASA Astrophysics Data System (ADS)
Repmann, Frank; Gerwin, Werner; Freese, Dirk
2017-04-01
An ever-growing demand for energy and the widely proposed switch from fossil fuels to more sustainable energy sources put the cultivation and use of bioenergy plants into focus. However, bioenergy production on regular, fertile agricultural soils might conflict with the worldwide growing demand for food. To mitigate or avoid this potential conflict, the use of low-quality or marginal land for the cultivation of bioenergy plants becomes favorable. Against this background, the definition and assessment of land marginality - the evaluation of whether and to what extent specific areas are marginal and thus suitable for sustainable bioenergy production - becomes highly relevant. Within the framework of the EU-funded Horizon 2020 project SEEMLA, we attempted to assess the land marginality of designated test sites in Ukraine, Greece and Germany by direct field survey. For that purpose, soil and site properties were investigated and evaluated by applying the Muencheberg Soil Quality Rating (SQR) method, developed at the Leibniz Centre for Agricultural Landscape Research (ZALF). The method deploys a comprehensive set of biogeophysical and chemical indicators to describe and finally evaluate the quality of the soil and site by a score ranging from 1 to 100 points. The field survey data were supported by additional laboratory tests on a representative set of soil samples. Practical field work and analysis of the field and laboratory data from the investigated sites proved the applicability of the SQR method within the SEEMLA context. The SQR indices calculated from the field and laboratory data ranged from 2 to < 40 and clearly demonstrated the marginality of the investigated sites in Ukraine, Greece and Germany, which differed considerably in their characteristics. Correlating the site quality index with yield estimates for common bioenergy plants such as willow (Salix sp.), black locust (Robinia pseudoacacia) and poplar (Populus sp.) cultivated at the respective test sites revealed that the SQR might additionally reflect the potential yield of the investigated sites.
A Bike Built for Magnetic Mapping
NASA Astrophysics Data System (ADS)
Schattner, U.; Segev, A.; Lyakhovsky, V.
2017-12-01
Understanding the magnetic signature of the subsurface geology is crucial for structural, groundwater, earthquake propagation, and mineral studies. The cheapest measuring method is walking with sensors. This approach yields high-resolution maps, yet its coverage is limited. We invented a new design that records magnetic data while riding a bicycle. The new concept offers an efficient, low-cost method of collecting high-resolution ground magnetic field data over rough terrain where conventional vehicles dare not venture, and improves the efficiency of the traditional method by more than five times. The bike-mounted magnetometer scales ground magnetism up from a localized site survey to regional coverage. To date we have covered 3,300 square kilometers (about the size of Rhode Island) across northern Israel, with profile spacing of 1-2 km. Initial Total Magnetic Intensity maps reveal a myriad of new features that were not detected by the low-resolution regional aeromagnetic survey, which collected data from a height of 1,000 m.
The Forum State of the Field Survey 2011
ERIC Educational Resources Information Center
Kreutzer, Kim
2012-01-01
In the summer of 2011, the Forum on Education Abroad conducted its fourth State of the Field Survey. This survey is an annual or biannual assessment of the very latest trends and issues in the field of education abroad. As in the past, questions on new topics have been combined with questions that have been asked on previous State of the Field…
Forde, Arnell S.; Dadisman, Shawn V.; Wiese, Dana S.; Phelps, Daniel C.
2013-01-01
In July (19 - 26) and November (17 - 18) of 1999, the USGS, in cooperation with the Florida Geological Survey (FGS), conducted two geophysical surveys in: (1) the Atlantic Ocean offshore of Florida's east coast from Orchid to Jupiter, FL, and (2) the Gulf of Mexico offshore of Venice, FL. This report serves as an archive of unprocessed digital boomer subbottom data, trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Filtered and gained (showing a relative increase in signal amplitude) digital images of the subbottom profiles are also provided. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, identifiers 99FGS01 and 99FGS02 refer to field data collected in 1999 for cooperative work with the FGS. The numbers 01 and 02 indicate the data were collected during the first and second field activities for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID).
NASA Astrophysics Data System (ADS)
Furlanetto, C.; Dye, S.; Bourne, N.; Maddox, S.; Dunne, L.; Eales, S.; Valiante, E.; Smith, M. W.; Smith, D. J. B.; Ivison, R. J.; Ibar, E.
2018-05-01
This paper forms part of the second major public data release of the Herschel Astrophysical Terahertz Large Area Survey (H-ATLAS). In this work, we describe the identification of optical and near-infrared counterparts to the submillimetre-detected sources in the 177 deg^2 North Galactic Plane (NGP) field. We used the likelihood ratio method to identify counterparts in the Sloan Digital Sky Survey and in the United Kingdom InfraRed Telescope Imaging Deep Sky Survey within a search radius of 10 arcsec of the H-ATLAS sources with a 4σ detection at 250 μm. We obtained reliable (R ≥ 0.8) optical counterparts with r < 22.4 for 42 429 H-ATLAS sources (37.8 per cent), with an estimated completeness of 71.7 per cent and a false identification rate of 4.7 per cent. We also identified counterparts in the near-infrared using deeper K-band data covering a smaller ~25 deg^2 area. We found reliable near-infrared counterparts to 61.8 per cent of the 250-μm-selected sources within that area. We assessed the performance of the likelihood ratio method in identifying optical and near-infrared counterparts, taking into account the depth and area of both input catalogues. Using catalogues with the same surface density of objects in the overlapping ~25 deg^2 area, we found that the reliable fraction in the near-infrared (54.8 per cent) is significantly higher than in the optical (36.4 per cent). Finally, using deep radio data covering a small region of the NGP field, we found that 80-90 per cent of our reliable identifications are correct.
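The likelihood ratio method used here weighs the probability that a candidate of magnitude m at radial offset r is the true counterpart against the chance of an unrelated background object, LR = q(m) f(r) / n(m), and converts the LRs of all candidates for one source into reliabilities R. A toy sketch with illustrative numbers (the values of q, n, the positional error, and Q0 are assumptions, not the paper's calibrations):

```python
import math

def positional_pdf(r_arcsec, sigma=1.0):
    # 2-D Gaussian positional-error distribution f(r)
    return math.exp(-r_arcsec**2 / (2 * sigma**2)) / (2 * math.pi * sigma**2)

def likelihood_ratio(q_m, n_m, r_arcsec, sigma=1.0):
    """LR = q(m) f(r) / n(m): true counterpart vs chance alignment,
    where q(m) is the counterpart magnitude distribution and n(m)
    the background source density at magnitude m."""
    return q_m * positional_pdf(r_arcsec, sigma) / n_m

def reliability(lrs, q0=0.7):
    """Reliability of each candidate, given all candidates' LRs for
    one source and the overall identification fraction Q0."""
    total = sum(lrs) + (1 - q0)
    return [lr / total for lr in lrs]

# Two candidates for one submm source: close and bright vs distant and faint
lrs = [likelihood_ratio(0.5, 0.01, 0.5), likelihood_ratio(0.2, 0.05, 3.0)]
rel = reliability(lrs)
print([round(r, 3) for r in rel])  # first candidate passes R >= 0.8
```

The R ≥ 0.8 cut quoted in the abstract is applied to exactly this kind of per-candidate reliability.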
UAV based hydromorphological mapping of a river reach to improve hydrodynamic numerical models
NASA Astrophysics Data System (ADS)
Lükő, Gabriella; Baranya, Sándor; Rüther, Nils
2017-04-01
Unmanned Aerial Vehicles (UAVs) are increasingly used in the field of engineering surveys. In river engineering, or in general, water resources engineering, UAV-based measurements have huge potential. For instance, indirect measurements of flow discharge using e.g. large-scale particle image velocimetry (LSPIV), particle tracking velocimetry (PTV), space-time image velocimetry (STIV) or radar have become a real alternative to direct flow measurements. Besides flow detection, topographic surveys are also essential for river flow studies, as the channel and floodplain geometry is the primary steering feature of the flow. UAVs can play an important role in this field, too. The widely used laser-based topographic survey method (LIDAR) can be deployed on UAVs; moreover, the Structure from Motion (SfM) method, which is based on images taken by UAVs, might be an even more cost-efficient alternative for revealing the geometry of distinct objects in the river or on the floodplain. The goal of this study is to demonstrate the utilization of photogrammetry and videogrammetry from airborne footage to provide geometry and flow data for a hydrodynamic numerical simulation of a 2 km long river reach in Albania. First, the geometry of the river is revealed photogrammetrically using the SfM method. Second, a more detailed view of the channel bed is taken at low water level. Using the fine-resolution images, BASEGrain, a MATLAB-based code developed at ETH Zürich, will be applied to determine the grain size characteristics of the river bed. This information will be essential for defining the hydraulic roughness in the numerical model. Third, flow mapping is performed using UAV measurements and the LSPIV method to quantitatively assess the flow field at the free surface and to estimate the discharge in the river.
All data collection and analysis will be carried out using a simple, low-cost UAV; moreover, all data processing will use open-source, freely available software, leading to a cost-efficient methodology. The results of the UAV-based measurements will be discussed and future research ideas will be outlined.
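At the core of the LSPIV step mentioned above is a cross-correlation between interrogation windows of consecutive frames: the peak of the correlation surface gives the displacement of surface tracers, and dividing by the frame interval (and applying the ground sampling distance) yields surface velocity. A minimal FFT-based sketch on a synthetic tracer pattern (illustrative, not the study's actual workflow):

```python
import numpy as np

def displacement(frame_a, frame_b):
    """Estimate the integer-pixel displacement of frame_b relative to
    frame_a via the peak of their FFT-based cross-correlation."""
    fa = np.fft.fft2(frame_a - frame_a.mean())
    fb = np.fft.fft2(frame_b - frame_b.mean())
    corr = np.fft.ifft2(fa.conj() * fb).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks beyond the half-size correspond to negative shifts
    if dy > corr.shape[0] // 2: dy -= corr.shape[0]
    if dx > corr.shape[1] // 2: dx -= corr.shape[1]
    return dy, dx

# Synthetic tracer pattern advected 3 px down and 5 px right between frames
rng = np.random.default_rng(2)
a = rng.random((64, 64))
b = np.roll(a, (3, 5), axis=(0, 1))
dy, dx = displacement(a, b)
print(dy, dx)
# surface velocity = displacement * ground sampling distance / frame interval
```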
SpIES: The Spitzer IRAC Equatorial Survey
NASA Technical Reports Server (NTRS)
Timlin, John D.; Ross, Nicholas P.; Richards, Gordon T.; Lacy, Mark; Ryan, Erin L.; Stone, Robert B.; Bauer, Franz E.; Brandt, W. N.; Fan, Xiaohui; Glikman, Eilat;
2016-01-01
We describe the first data release from the Spitzer-IRAC Equatorial Survey (SpIES): a large-area survey of approximately 115 deg^2 in the equatorial SDSS Stripe 82 field using Spitzer during its "warm" mission phase. SpIES was designed to probe sufficient volume to perform measurements of quasar clustering and the luminosity function at z ≥ 3 to test various models for "feedback" from active galactic nuclei (AGNs). Additionally, the wide range of available multi-wavelength, multi-epoch ancillary data enables SpIES to identify both high-redshift (z ≥ 5) quasars and obscured quasars missed by optical surveys. SpIES achieves 5σ depths of 6.13 µJy (21.93 AB magnitude) and 5.75 µJy (22.0 AB magnitude) at 3.6 and 4.5 µm, respectively; these depths are significantly fainter than those of the Wide-field Infrared Survey Explorer (WISE). We show that the SpIES survey recovers a much larger fraction of spectroscopically confirmed quasars (approximately 98%) in Stripe 82 than is recovered by WISE (55%). This depth is especially powerful at high redshift (z ≥ 3.5), where SpIES recovers 94% of confirmed quasars, whereas WISE recovers only 25%. Here we define the SpIES survey parameters and describe the image processing, source extraction, and catalog production methods used to analyze the SpIES data. In addition to this survey paper, we release 234 images created by the SpIES team and three detection catalogs: a 3.6 µm-only detection catalog containing approximately 6.1 million sources, a 4.5 µm-only detection catalog containing approximately 6.5 million sources, and a dual-band detection catalog containing approximately 5.4 million sources.
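The quoted depths can be cross-checked with the standard AB magnitude relation m_AB = -2.5 log10(f_nu / 3631 Jy):

```python
import math

def ab_mag(flux_ujy):
    """AB magnitude from a flux density given in microjanskys."""
    return -2.5 * math.log10(flux_ujy * 1e-6 / 3631.0)

print(round(ab_mag(6.13), 2))  # 3.6 um depth -> 21.93
print(round(ab_mag(5.75), 2))  # 4.5 um depth -> 22.0
```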
Space weather effects on ground based technology
NASA Astrophysics Data System (ADS)
Clark, T.
Space weather can affect a variety of forms of ground-based technology, usually as a result of either the direct effects of the varying geomagnetic field, or as a result of the induced electric field that accompanies such variations. Technologies affected directly by geomagnetic variations include magnetic measurements made during geophysical surveys, and navigation relying on the geomagnetic field as a direction reference, a method that is particularly common in the surveying of well-bores in the oil industry. The most obvious technology affected by induced electric fields during magnetic storms is electric power transmission, where the example of the blackout in Quebec during the March 1989 magnetic storm is widely known. Additionally, space weather effects must be taken into account in the design of active cathodic protection systems on pipelines to protect them against corrosion. Long-distance telecommunication cables may also have to be designed to cope with space weather related effects. This paper reviews the effects of space weather in these different areas of ground-based technology, and provides examples of how mitigation against hazards may be achieved. (The paper does not include the effects of space weather on radio communication or satellite navigation systems).
Measuring discharge with acoustic Doppler current profilers from a moving boat
Mueller, David S.; Wagner, Chad R.; Rehmel, Michael S.; Oberg, Kevin A.; Rainville, Francois
2013-01-01
The use of acoustic Doppler current profilers (ADCPs) from a moving boat is now a commonly used method for measuring streamflow. The technology and methods for making ADCP-based discharge measurements are different from the technology and methods used to make traditional discharge measurements with mechanical meters. Although the ADCP is a valuable tool for measuring streamflow, it is only accurate when used with appropriate techniques. This report presents guidance on the use of ADCPs for measuring streamflow; this guidance is based on the experience of U.S. Geological Survey employees and published reports, papers, and memorandums of the U.S. Geological Survey. The guidance is presented in a logical progression, from predeployment planning, to field data collection, and finally to post processing of the collected data. Acoustic Doppler technology and the instruments currently (2013) available also are discussed to highlight the advantages and limitations of the technology. More in-depth, technical explanations of how an ADCP measures streamflow and what to do when measuring in moving-bed conditions are presented in the appendixes. ADCP users need to know the proper procedures for measuring discharge from a moving boat and why those procedures are required, so that when the user encounters unusual field conditions, the procedures can be adapted without sacrificing the accuracy of the streamflow-measurement data.
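In the moving-boat method explained in the appendixes referenced above, discharge is the integral over time and depth of the cross product of water velocity and boat velocity, Q = ∫∫ (v_w u_b - u_w v_b) dz dt, with the sign set by the transect direction, so no predefined cross-section is needed. A minimal discretized sketch with uniform synthetic flow (illustrative, not the USGS processing software):

```python
import numpy as np

def moving_boat_discharge(u_w, v_w, u_b, v_b, dz, dt):
    """Discharge as the discretized cross-product integral:
    sum over depth cells and ensembles of (v_w*u_b - u_w*v_b)*dz*dt.
    u/v are x/y velocity components; *_w water (cells x ensembles),
    *_b boat (per ensemble). Sign follows the transect direction."""
    cross = v_w * u_b[None, :] - u_w * v_b[None, :]
    return float(cross.sum() * dz * dt)

# Synthetic check: uniform flow of 0.8 m/s in +y; boat crossing in +x
# at 1.0 m/s for 100 s -> width 100 m; 20 cells of 0.25 m -> depth 5 m.
n_cells, n_ens = 20, 100
dz, dt = 0.25, 1.0
u_w = np.zeros((n_cells, n_ens))
v_w = np.full((n_cells, n_ens), 0.8)
u_b = np.full(n_ens, 1.0)
v_b = np.zeros(n_ens)

Q = moving_boat_discharge(u_w, v_w, u_b, v_b, dz, dt)
print(Q)  # 0.8 m/s * 100 m * 5 m = 400 m^3/s
```

Because only the component of water velocity perpendicular to the boat track contributes, the boat need not follow a straight line, which is what makes the method practical in the field.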
Goldstein, Carly M; Minges, Karl E; Schoffman, Danielle E; Cases, Mallory G
2017-02-01
Behavioral medicine training is due for an overhaul given the rapid evolution of the field, including a tight funding climate, changing job prospects, and new research and industry collaborations. The purpose of the present study was to collect responses from trainee and practicing members of a multidisciplinary professional society about their perceptions of behavioral medicine training and their suggestions for changes to training for future behavioral medicine scientists and practitioners. A total of 162 faculty and 110 students (total n = 272) completed a web-based survey on strengths of their current training programs and ideas for changes. Using a mixed-methods approach, the survey findings are used to highlight seven key areas for improved preparation of the next generation of behavioral medicine scientists and practitioners, which are grant writing, interdisciplinary teamwork, advanced statistics and methods, evolving research program, publishable products from coursework, evolution and use of theory, and non-traditional career paths.
NASA Astrophysics Data System (ADS)
Chen, Daniel T. N.; Wen, Qi; Janmey, Paul A.; Crocker, John C.; Yodh, Arjun G.
2010-04-01
Research on soft materials, including colloidal suspensions, glasses, pastes, emulsions, foams, polymer networks, liquid crystals, granular materials, and cells, has captured the interest of scientists and engineers in fields ranging from physics and chemical engineering to materials science and cell biology. Recent advances in rheological methods to probe the mechanical responses of these complex media have been instrumental in producing new understanding of soft matter and in generating novel technological applications. This review surveys these technical developments and current work in the field, in part to illustrate open questions for future research.
Supersonic Coaxial Jet Experiment for CFD Code Validation
NASA Technical Reports Server (NTRS)
Cutler, A. D.; Carty, A. A.; Doerner, S. E.; Diskin, G. S.; Drummond, J. P.
1999-01-01
A supersonic coaxial jet facility has been designed to provide experimental data suitable for the validation of CFD codes used to analyze high-speed propulsion flows. The center jet is of a light gas and the coflow jet is of air, and the mixing layer between them is compressible. Various methods have been employed in characterizing the jet flow field, including schlieren visualization, pitot, total temperature and gas sampling probe surveying, and RELIEF velocimetry. A Navier-Stokes code has been used to calculate the nozzle flow field and the results compared to the experiment.
NASA Astrophysics Data System (ADS)
Jernsletten, J. A.
2004-12-01
This report describes the outcome of a fast-turnoff transient electromagnetic (TEM) geophysical survey carried out in the Peña de Hierro ("Hill of Iron") field area of the Mars Analog Research and Technology Experiment (MARTE) during May and June of 2003. The MARTE Peña de Hierro field area is located between the towns of Rio Tinto and Nerva in the Andalucia region of Spain, about an hour's drive west of the city of Sevilla and an hour's drive north of Huelva. The high concentration of dissolved iron (and smaller amounts of other metals) in the very acidic water of the Rio Tinto area gives the water its characteristic wine-red color and also makes it highly conductive; such an acidic, conductive fluid is well suited to exploration by electromagnetic methods. This naturally acidic environment is maintained by bacteria in the groundwater, and it is these bacteria that are the main focus of the MARTE project overall and of this supporting geophysical work. The goal of this study is to map the subsurface extent of the high-conductivity (low-resistivity) levels, and thus by proxy the subsurface extent of the acidic groundwater and its bacterial populations. In so doing, the viability of using electromagnetic methods to map these subsurface metal-rich water bodies is also examined and demonstrated, and the geophysical data will serve to support drilling efforts. The purpose of this field survey was an initial effort to map certain conductive features in the field area, in support of the drilling operations that are central to the MARTE project. These conductive features include the primary target of exploration for MARTE, the very conductive acidic groundwater in the area (which is extremely rich in metals). Other conductive features include the pyritic ore bodies in the area, as well as extensive mine-tailings piles.
Characteristics of Hospital-Based Munchausen Syndrome by Proxy in Japan
ERIC Educational Resources Information Center
Fujiwara, Takeo; Okuyama, Makiko; Kasahara, Mari; Nakamura, Ayako
2008-01-01
Objective: This article explores characteristics of Munchausen Syndrome by Proxy (MSBP) in Japan, a country which provides an egalitarian, low cost, and easy-access health care system. Methods: We sent a questionnaire survey to 11 leading doctors in the child abuse field in Japan, each located in different hospital-based sites. Child abuse doctors…
Evaluating ozone air pollution effects on pines in the western United States
Paul R. Miller; Kenneth W. Stolte; Daniel M. Duriscoe; John Pronos
1996-01-01
Historical and technical background is provided about ozone air pollution effects on ponderosa (Pinus ponderosa Dougl. ex Laws) and Jeffrey (P. jeffreyi Grev. and Balf.) pines in forests of the western United States. The principal aim is to document the development of field survey methods to be applied to assessment of chronic...
Current Literature on Venereal Disease, 1973. Number One. Abstracts and Bibliography.
ERIC Educational Resources Information Center
Center for Disease Control (DHEW/PHS), Atlanta, GA.
This report presents a survey of recently published literature in the field of venereal disease. The five main topics covered are a) diagnosis and management of syphilis and other treponematoses, b) gonorrhea, c) minor venereal and related diseases, d) public health methods, and e) behavioral studies. The material in each of these sections…
ERIC Educational Resources Information Center
Ma, Wen; Wang, Chuang
2012-01-01
International students in the United States often employ culture-specific learning strategies to help them improve their proficiency in English. This study explored the use of self-regulated strategies by 49 Chinese graduate students from 24 fields of study at three universities in the Northeast. The research used the mixed survey method to…
Sectional Aluminum Poles Improve Length Measurements in Standing Trees
Joe P. McClure
1968-01-01
The use of sectional aluminum poles to measure lengths in standing trees can reduce bias and improve measurement precision. The method has been tested extensively under a variety of field conditions by Forest Survey crews in the Southeast. Over 16,000 trees with lengths up to 120 feet have been measured over the past 5 years.
ERIC Educational Resources Information Center
Youngs, Peter; Qian, Hong
2013-01-01
In this article, we draw on survey data to investigate associations between Chinese elementary teaching candidates’ mathematical knowledge for teaching (MKT) and their experiences in mathematics courses, mathematics methods courses, and student teaching. In our study, we found that (a) Chinese teaching candidates' completion of courses in number…
Philip Taylor; Jian J. Duan; Roger Fuester
2011-01-01
Classical biological control efforts against emerald ash borer (EAB) (Agrilus planipennis Fairmaire) in North America primarily have focused on introduction and releases of exotic parasitoid species collected from northern parts of China. Recently, field surveys in Michigan, Pennsylvania, Ohio, and Ontario also indicate that some existing parasitoids...
HPERD Administrators' Perspectives Concerning Importance and Practice of Selected Marketing Methods.
ERIC Educational Resources Information Center
Ballew, Jerry L.
This paper reports on the critical role that marketing can have on the health, physical education, recreation, and dance professions (HPERD) and on a national survey of college administrators in the field and their attitudes and practices at the college level. The first half of the paper briefly traces the growing impact of marketing on service…
Students Perceived Value towards Quality of Distance Education in Tamil Nadu
ERIC Educational Resources Information Center
Jeyaraj, P.; Sugumar, D.; Thandavamoorthy, K.; Xavier, S. Joseph
2014-01-01
The quality of education in any distance learning programme is maintained in various ways, such as the quality of the study material, internal and external evaluation, and student support methods. These aspects should be available to postgraduate degree students. In this research, an ex post facto design with a field survey is…
Educational Data Mining Applications and Tasks: A Survey of the Last 10 Years
ERIC Educational Resources Information Center
Bakhshinategh, Behdad; Zaiane, Osmar R.; ElAtia, Samira; Ipperciel, Donald
2018-01-01
Educational Data Mining (EDM) is the field of using data mining techniques in educational environments. There exist various methods and applications in EDM which can follow both applied research objectives such as improving and enhancing learning quality, as well as pure research objectives, which tend to improve our understanding of the learning…
ERIC Educational Resources Information Center
McNeese, Rose M.; Roberson, Thelma; Haines, Geoffry
2009-01-01
This manuscript presents findings from a mixed method study that sought to identify the factors that motivate graduate students to pursue a degree in the field of education administration. One hundred sixty-one graduate students from three universities located in Mississippi participated in the study. Participants completed a 10-item survey using…
Ali, Usman; Ahmed, Khawaja Bashrat; Awan, Muhammad Siddique; Asraf, Shaid; Basher, Mohammad; Awan, Mohammad Naeem
2007-09-15
A nine-month field survey was conducted from July 2004 to August 2005 to collect data on the distribution and population status of the Himalayan ibex (Capra ibex sibirica) in the upper Neelum valley of Azad Kashmir. The survey was carried out using direct (senses) as well as indirect (sampling) methods. A total of 122 animals of different categories were recorded in the study area. The average population was composed of 31.79% males, 32.79% females, 25.41% young, and 9.84% yearlings. Various threats to the ibex population in the area were also studied.
NASA Astrophysics Data System (ADS)
Webb, S. J.; Jones, M. Q.; Durrheim, R. J.; Nyblade, A.; Snyman, Q.
2012-12-01
Hard-rock exploration and mining present many opportunities for the effective use of near-surface geophysics. For over 10 years the AfricaArray international geophysics field school has been hosted at a variety of mines in South Africa. While the main objective of the field school is practical training for the next generation of geophysicists, being hosted at a mine has allowed us to investigate applications of near-surface geophysics in the early stages of mine planning and development, as geophysics is often cheaper and faster than drilling. Applications include detailed delineation of dykes and stringer dykes, physical property measurements on drill core for modeling and marker horizons, determination of overburden thickness, and location of water and faults. Dolerite dykes are usually magnetic and are associated with loss of ground (i.e., where the dyke replaces the ore and thus reduces the amount of ore available) and with safety and stability concerns. Accurate mapping of dykes and the narrow stringers associated with them is therefore crucial to the safe planning of a mine. We have acquired several case studies in which ground magnetic surveys greatly improved on the resolution and detail of airborne magnetic surveys in regions of complicated dyke swarms. In many cases, thin stringer dykes of less than 5 cm have been detected. Physical property measurements of these dykes can be used to distinguish between dykes of different ages. It is important to accurately determine overburden thickness when planning an open-pit mine, as this directly affects the cost of development. Depending on the nature of the overburden, refraction seismics and/or DC resistivity can provide continuous profiling in the area of interest that fills in gaps between boreholes. DC resistivity is also effective for locating water associated with dykes and structures that may affect mine planning. The field school mainly addresses the training of a variety of students.
The core students are the geophysics Honours students (~4th year undergraduates). In addition, up to 8 students from all over Africa are included in the program to help address practical training in Africa. The final cohort are minority students from the USA. Participants spend a week planning and costing out surveys, a week in the field collecting data using different methods including: gravity, DGPS, magnetics, resistivity, refraction seismic, EM methods, core logging and physical property measurements. The final week is spent interpreting and integrating their results. Graduate students are given the opportunity to instruct on the field school and manage the logistics for a particular method. The field school is unique in Africa and satisfies a need for practical training with limited resources, with a rare blend of cultural interactions!
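Overburden-thickness estimation from a two-layer refraction survey, one of the applications named above, follows from the standard crossover-distance formula; a minimal sketch, with made-up example velocities and crossover distance:

```python
import math

def overburden_thickness(x_cross, v1, v2):
    """Depth to the refractor from the crossover distance x_cross (m)
    and layer velocities v1 < v2 (m/s), via the standard two-layer
    formula z = (x_cross / 2) * sqrt((v2 - v1) / (v2 + v1))."""
    return (x_cross / 2.0) * math.sqrt((v2 - v1) / (v2 + v1))

# Example: dry overburden (600 m/s) over fresh bedrock (2400 m/s),
# direct and refracted arrivals crossing over at 20 m offset:
print(round(overburden_thickness(20.0, 600.0, 2400.0), 2))  # 7.75 (m)
```

Running such a spread every few tens of meters gives the continuous overburden profile between boreholes that the abstract describes.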
NASA Astrophysics Data System (ADS)
Hall, S. R.; Anderson, J.; Rajakaruna, N.; Cass, D.
2014-12-01
At the College of the Atlantic, Bar Harbor, Maine, undergraduate students have the opportunity to design their own curriculum within a major of "Human Ecology." To enable students to have early research experiences, we developed a field-based interdisciplinary program for students to learn and practice field methods in a variety of disciplines, Earth Science, Botany, Chemistry, and Wildlife Biology at three specific field sites within a single watershed on Mt. Desert Island. As the Northeast Creek watershed was the site of previous water quality studies, this program of courses enabled continued monitoring of portions of the watershed. The program includes 4 new courses: Critical Zone 1, Critical Zone 2, Wildlife Biology, and Botany. In Critical Zone 1 students are introduced to general topics in Earth Science and learn to use ArcGIS to make basic maps. In Critical Zone 2, Wildlife Biology, and Botany, students are in the field every week using classic field tools and methods. All three of these courses use the same three general field areas: two with working farms at the middle and lower portion of the watershed and one uninhabited forested property in the higher relief headwaters of the watershed. Students collect daily surface water chemistry data at five stream sites within the watershed, complete basic geologic bedrock and geomorphic mapping, conduct wildlife surveys, botanical surveys, and monitor weather patterns at each of the main sites. Beyond the class data collected and synthesized, students also complete group independent study projects at focused field sites, some of which have turned into much larger research projects. This program is an opportunity for students and faculty with varied interests and expertise to work together to study a specific field locality over multiple years. 
We see this model as enhancing a number of positive education components: field-based learning, teamwork, problem solving, interdisciplinary discussion, multiple faculty interaction, student mentoring, and original research. In the future we see the possibility of welcoming even more interdisciplinary work including rigorous studies spanning the arts and humanities.
NASA Astrophysics Data System (ADS)
Pratt-Sitaula, B.; Charlevoix, D. J.; Douglas, B. J.; Crosby, B. T.; Crosby, C. J.; Lauer, I. H.; Shervais, K.
2017-12-01
Field experiences have long been considered an integral part of geoscience learning. However, as data acquisition technologies evolve, undergraduate field courses need to keep pace so students gain exposure to new technologies relevant to the modern workforce. Maintaining expertise in new technologies is also challenging for established field education programs. Professional development and vetted curriculum present an opportunity to advance student exposure to new geoscience data acquisition technology. The GEodesy Tools for Societal Issues (GETSI) Field Collection, funded by NSF's Improving Undergraduate STEM Education program, addresses these needs in geodesy field education. Geodesy is the science of accurately measuring Earth's size, shape, orientation, and mass distribution, and the variations of these with time. Modern field geodesy methods include terrestrial laser scanning (TLS), kinematic and static GPS/GNSS surveying (global positioning system/global navigation satellite system), and structure-from-motion (SfM) photogrammetry. The GETSI Field Collection is a collaborative project between UNAVCO, Indiana University, and Idaho State University. The project provides curriculum modules and instructor training (in the form of short courses) to facilitate the inclusion of SfM, TLS, and GPS surveying in geoscience courses with field components. The first module, "Analyzing High Resolution Topography with TLS and SfM," is available via SERC (serc.carleton.edu/getsi/teaching_materials/high-rez-topo); the second module, "High Precision Positioning with Static and Kinematic GPS/GNSS," will be published in 2018. The module development and assessment follow the standards of the InTeGrate Project (an NSF STEP Center), previously tested on geodesy content in the GETSI classroom collection (serc.carleton.edu/getsi). This model emphasizes use of best practices in STEM education, including situating learning in the context of societal importance.
Analysis of student work during development and testing shows a high level of achievement of module learning goals. Two four-day short courses have been run to train instructors on best practices for integration of these topics into field courses. Overall participant satisfaction with the short courses has been 9 out of 10.
Field-Effect Biosensors for On-Site Detection: Recent Advances and Promising Targets.
Choi, Jaebin; Seong, Tae Wha; Jeun, Minhong; Lee, Kwan Hyi
2017-10-01
There is an explosive interest in the immediate and cost-effective analysis of field-collected biological samples, as many advanced biodetection tools are highly sensitive, yet immobile. On-site biosensors are portable and convenient sensors that provide detection results at the point of care. They are designed to secure precision in highly ionic and heterogeneous solutions with minimal hardware. Among various methods that are capable of such analysis, field-effect biosensors are promising candidates due to their unique sensitivity, manufacturing scalability, and integrability with computational circuitry. Recent developments in nanotechnological surface modification show promising results in sensing from blood, serum, and urine. This report gives a particular emphasis on the on-site efficacy of recently published field-effect biosensors, specifically, detection limits in physiological solutions, response times, and scalability. The survey of the properties and existing detection methods of four promising biotargets, exosomes, bacteria, viruses, and metabolites, aims at providing a roadmap for future field-effect and other on-site biosensors. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
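A central difficulty the abstract alludes to (securing precision in highly ionic solutions) is Debye screening: charges farther from the gate surface than the Debye length are electrostatically invisible to a field-effect sensor. A quick estimate using the standard rule of thumb for water at 25 °C, λ_D ≈ 0.304 nm / √I with I in mol/L (the example ionic strength is illustrative):

```python
import math

def debye_length_nm(ionic_strength_molar):
    """Approximate Debye screening length in water at 25 C, in nm,
    using the rule of thumb lambda_D ~ 0.304 / sqrt(I [mol/L])."""
    return 0.304 / math.sqrt(ionic_strength_molar)

# Physiological saline (~150 mM): only charges within ~0.8 nm of the
# gate surface are sensed, which is why desalting, antibody-fragment
# receptors, or other surface engineering is common in FET biosensing.
print(round(debye_length_nm(0.15), 2))  # 0.78 (nm)
```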
Interest of LQAS method in a survey of HTLV-I infection in Benin (West Africa).
Houinato, Dismand; Preux, Pierre-Marie; Charriere, Bénédicte; Massit, Bruno; Avodé, Gilbert; Denis, François; Dumas, Michel; Boutros-Toni, Fernand; Salamon, Roger
2002-02-01
HTLV-I is heterogeneously distributed in Sub-Saharan Africa. Traditional survey methods such as cluster sampling can provide information for a country or region of interest, but they cannot identify small areas with higher prevalences of infection to help in health policy planning. Identification of such areas can be done by the Lot Quality Assurance Sampling (LQAS) method, which is commonly used in industry to identify poor performance in assembly lines. The LQAS method was used in Atacora (Northern Benin) between March and May 1998 to identify areas with an HTLV-I seroprevalence higher than 4%. Sixty-five subjects were randomly selected in each of the 36 communes (lots) of this department. Lots were classified as unacceptable when the sample contained at least one positive subject. The LQAS method identified 25 (69.4%) communes with a prevalence higher than 4%. Using stratified sampling theory, the overall HTLV-I seroprevalence was 4.5% (95% CI: 3.6-5.4%). These data demonstrate the value of applying the LQAS method under field conditions to detect clusters of infection.
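The operating characteristics of the decision rule described above (65 subjects per lot, lot flagged if at least one is seropositive) follow directly from the binomial distribution; a quick check of the design against the 4% threshold, as a sketch:

```python
# Probability that a lot is classified "unacceptable" under the rule
# "flag the lot if >= 1 of n sampled subjects is seropositive".
def p_flagged(prevalence, n=65):
    return 1.0 - (1.0 - prevalence) ** n

# At the 4% threshold the design flags a lot ~93% of the time, but a
# zero-positives acceptance rule is conservative: even a 1%-prevalence
# lot is flagged roughly half the time.
print(round(p_flagged(0.04), 3))  # 0.930
print(round(p_flagged(0.01), 3))  # 0.480
```

This trade-off (high sensitivity at the threshold, many flags below it) is typical of zero-acceptance LQAS designs and is consistent with the large fraction of communes flagged in the survey.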
SHELS: A complete galaxy redshift survey with R ≤ 20.6
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geller, Margaret J.; Hwang, Ho Seong; Fabricant, Daniel G.
2014-08-01
The SHELS (Smithsonian Hectospec Lensing Survey) is a complete redshift survey covering two well-separated fields (F1 and F2) of the Deep Lens Survey to a limiting R = 20.6. Here we describe the redshift survey of the F2 field (R.A.(2000) = 09h19m32.4 and decl.(2000) = +30°00'00''). The survey includes 16,294 new redshifts measured with the Hectospec on the MMT. The resulting survey of the 4 deg² F2 field is 95% complete to R = 20.6, currently the densest survey to this magnitude limit. The median survey redshift is z = 0.3; the survey provides a view of structure in the range 0.1 ≲ z ≲ 0.6. An animation displays the large-scale structure in the survey region. We provide a redshift, spectral index D_n4000, and stellar mass for each galaxy in the survey. We also provide a metallicity for each galaxy in the range 0.2…
Cardarelli, Ettore; Di Filippo, Gerardina
2009-09-01
Resistivity and induced polarization surveying were originally developed for mineral exploration but are now finding new applications in the field of environmental and engineering geophysics. The present article reports the results of a geophysical survey performed with the aim of identifying a plume of chlorinated hydrocarbons in sedimentary formations of the Padanian plain. The tested site is characterized by three sand-and-gravel aquifers containing a quantity of clay particles that influences the overall bulk resistivity and chargeability. According to data obtained from shallow boreholes, the contaminants found in the first and second aquifers were mainly dense non-aqueous phase liquids. The aforementioned geo-electrical methods were applied in both two- and three-dimensional approaches. Steel and copper electrodes were used in the field data acquisition, and the results of the survey were compared. The geophysical survey revealed some anomalies that could be explained by the presence of dense non-aqueous phase liquids in the soil medium. The concept of normalized chargeability facilitates the interpretation of detected induced polarization anomalies. The shape of the plume was inferred from maps of resistivity and chargeability to a depth of 25 m below the ground surface.
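Normalized chargeability, which the abstract credits with easing interpretation, is simply the chargeability divided by the resistivity magnitude, which suppresses anomalies caused by conduction alone and emphasizes genuinely polarizable material. A minimal sketch (the example values are illustrative, not from the survey):

```python
def normalized_chargeability(chargeability_mv_per_v, resistivity_ohm_m):
    """Normalized chargeability: intrinsic chargeability (mV/V)
    divided by resistivity (ohm-m), giving units of mS/m."""
    return chargeability_mv_per_v / resistivity_ohm_m

# Hypothetical clay-bearing sand: m = 10 mV/V at rho = 50 ohm-m
print(normalized_chargeability(10.0, 50.0))  # 0.2 (mS/m)
```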
Carter-Pokras, Olivia D.; Spirtas, Robert; Bethune, Lisa; Mays, Vickie; Freeman, Vincent L.; Cozier, Yvette C.
2013-01-01
Purpose: In the past decade, we have witnessed increasing numbers of individuals entering the field of epidemiology. With the increase has also come a diversity of training and of the paths by which individuals entered the field. The purpose of this survey was characterization of the epidemiology workforce, its job diversity, and its continuing education needs. Methods: The Minority Affairs and Membership committees of the American College of Epidemiology (ACE) prepared and administered a workforce survey to identify the racial/ethnic diversity, demographic background, workplace type, credentials, income, subspecialties, and continuing education needs of epidemiologists. The survey was self-administered to attendees of the Second North American Congress of Epidemiology in June 2006. Results: A sample of 397 respondents of the 1348 registered for the Congress was captured (29.5% response). Epidemiologists who participated were from 36 states and 18 countries; 54.6% were trained at the doctoral level; 19.1% earned $120,001 or more a year. A wide range of epidemiology subspecialties and continuing education needs were identified. Conclusions: This preliminary snapshot of epidemiologists indicates a wide range of training mechanisms, workplace sites, and subspecialties. The results indicate a need to examine the core graduate training needs of epidemiologists, as well as to respond to desired professional development needs through the provision of continuing education efforts. PMID:19344867
Ecology of hymexazol-insensitive Pythium species in field soils.
Ali-Shtayeh, Mohammed; Salah, Ayman M A; Jamous, Rana M
2003-01-01
Soils from 100 irrigated fields (95 under vegetables, 5 under citrus) in different geographical locations in the West Bank (Palestinian Autonomous Territory) were surveyed for hymexazol-insensitive (HIS) Pythium species using the surface soil dilution plate (SSDP) method with the VP3 medium amended with 50 mg/L hymexazol (HMI) (VP3H50), over a period of 12 months. HIS Pythium species were isolated from 37% of the soils surveyed, with mean population levels ranging from 4.3 to 1422 CFU g(-1) dry weight. Eight HIS Pythium taxa were recovered on the VP3H50 medium, the most abundant of which was P. vexans (found in 29% of the field soils surveyed). Seasonal variations in population levels of HIS Pythium species were studied in four fields over a period of 12 months. Significant seasonal variations in HIS population levels were detected in the four fields, with the highest population levels of HIS Pythium spp. encountered in spring and the lowest in winter in three of the fields surveyed. Effects of HMI on the linear growth and colony morphology of 149 Pythium spp. isolates were examined on CMA amended with HMI at five concentrations. Pythium vexans isolates responded differently from those of the other Pythium species: isolates of this important pathogen were more insensitive to HMI at high concentrations than the other main species tested. A large proportion of the P. ultimum isolates was either insensitive or only weakly sensitive to HMI. Furthermore, a few isolates of other Pythium species were insensitive to the fungicide at various concentrations. The colony morphology of P. vexans isolates was not affected by HMI, whereas colonies of the other species showed sparse growth on the HMI-amended medium relative to the control. The pathogenicity of P. vexans and P. ultimum isolates to cucumber seedlings was examined in growth chambers. Insensitive isolates of both species were found to be more virulent damping-off pathogens than the sensitive isolates.
The present study demonstrates that HMI cannot be used effectively to control Pythium spp. in soils inhabited by high densities of HIS Pythium pathogens.
Why Don't We Ask? A Complementary Method for Assessing the Status of Great Apes
Meijaard, Erik; Mengersen, Kerrie; Buchori, Damayanti; Nurcahyo, Anton; Ancrenaz, Marc; Wich, Serge; Atmoko, Sri Suci Utami; Tjiu, Albertus; Prasetyo, Didik; Nardiyono; Hadiprakarsa, Yokyok; Christy, Lenny; Wells, Jessie; Albar, Guillaume; Marshall, Andrew J.
2011-01-01
Species conservation is difficult. Threats to species are typically high and immediate. Effective solutions for counteracting these threats, however, require synthesis of high quality evidence, appropriately targeted activities, typically costly implementation, and rapid re-evaluation and adaptation. Conservation management can be ineffective if there is insufficient understanding of the complex ecological, political, socio-cultural, and economic factors that underlie conservation threats. When information about these factors is incomplete, conservation managers may be unaware of the most urgent threats or unable to envision all consequences of potential management strategies. Conservation research aims to address the gap between what is known and what knowledge is needed for effective conservation. Such research, however, generally addresses a subset of the factors that underlie conservation threats, producing a limited, simplistic, and often biased view of complex, real world situations. A combination of approaches is required to provide the complete picture necessary to engage in effective conservation. Orangutan conservation (Pongo spp.) offers an example: standard conservation assessments employ survey methods that focus on ecological variables, but do not usually address the socio-cultural factors that underlie threats. Here, we evaluate a complementary survey method based on interviews of nearly 7,000 people in 687 villages in Kalimantan, Indonesia. We address areas of potential methodological weakness in such surveys, including sampling and questionnaire design, respondent biases, statistical analyses, and sensitivity of resultant inferences. We show that interview-based surveys can provide cost-effective and statistically robust methods to better understand poorly known populations of species that are relatively easily identified by local people. 
Such surveys provide reasonably reliable estimates of relative presence and relative encounter rates of such species, as well as quantifying the main factors that threaten them. We recommend more extensive use of carefully designed and implemented interview surveys, in conjunction with more traditional field methods. PMID:21483859
Multi-parametric survey for archaeology: how and why, or how and why not?
NASA Astrophysics Data System (ADS)
Hesse, Albert
1999-03-01
Many papers and conference presentations, particularly over the last ten years, have referred to multi-parametric geophysical surveys and integrated interpretations in archaeological prospection. Several experiments of this kind have been undertaken by our laboratory, with mostly fascinating results, but our experience leads us to be rather suspicious of the over-systematic choice of extreme solutions, and we would recommend an appropriate and balanced choice, within the limits of the budget available for an operation, between the two following procedures: 1) Routine survey with an extremely large variety of instruments: this allows a better understanding of the underground situation than survey with a single instrument, but reduces the area that can be surveyed. A limited number of specific circumstances should lead one to adopt this option. They include: previous knowledge (or, equally, previous ignorance) of the targets under investigation, preliminary selection of the most efficient method on a scientific and economic basis, comparative experiments for the validation of new tools, specific detection of targets of differing nature in the ground, and uncertainty about the efficiency of each available method for the actual nature of the investigated site. 2) Survey of a much larger area with only one method, chosen because it is particularly fast and efficient: there is obvious value in extensive exploration in order to evaluate the size, distribution, and limits of a large number of archaeological features. The strict selection of appropriate methods chosen to meet the aims of a project should consider not only geophysics but all kinds of conventional and non-conventional archaeological methods as well, brought together to permit an integrated interpretation.
This highly specialized job does not fall within the normal experience of exploration geophysicists who usually deal with geological features or most field archaeologists who are mainly involved in excavations. It must be undertaken by particularly trained operators, whether they belong to private companies (under appropriate official control) or to public organizations.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-16
...) Information Collection Request Title: Field Metering of Electrical Appliances; (3) Type of Request: New; (4) Purpose: Metered field data from electrical appliances is necessary to support characterization of energy... Conservation Program for Consumer Products: Survey of Field Energy Consumption of Residential Refrigerators...
Detection of a possible superluminous supernova in the epoch of reionization
NASA Astrophysics Data System (ADS)
Mould, Jeremy; Abbott, Tim; Cooke, Jeff; Curtin, Chris; Katsiani, Antonios; Koekemoer, Anton; Tescari, Edoardo; Uddin, Syed; Wang, Lifan; Wyithe, Stuart
2017-04-01
An interesting transient has been detected in one of our three Dark Energy Camera deep fields. Observations of these deep fields take advantage of the high red sensitivity of DECam on the Cerro Tololo Inter-American Observatory Blanco telescope. The survey includes the Y band, with rest wavelength 1430 Å at z = 6. Survey fields (the Prime field 0555-6130, the 16hr field 1600-75 and the SUDSS New Southern Field) are deeper in Y than other infrared surveys. They are circumpolar, allowing all night to be used efficiently, exploiting the moon tolerance of 1 micron observations to minimize conflict with the Dark Energy Survey. As an i-band dropout (meaning that the flux decrement shortward of Lyman alpha falls in the i bandpass), the transient we report here is a supernova candidate at z ≈ 6, with a luminosity comparable to the brightest known current-epoch superluminous supernovae (i.e., 2 × 10^11 solar luminosities).
A field guide to amphibian larvae and eggs of Minnesota, Wisconsin, and Iowa
Parmelee, J.R.; Knutson, M.G.; Lyon, J.E.
2002-01-01
Apparent worldwide declines in amphibian populations (Pechmann and Wake 1997) have stimulated interest in amphibians as bioindicators of the health of ecosystems. Because we have little information on the population status of many species, there is interest by public and private land management agencies in monitoring amphibian populations. Amphibian egg and larval surveys are established methods of surveying pond-breeding amphibians. Adults may be widely dispersed across the landscape, but eggs and larvae are confined to the breeding site during a specific season of the year. Also, observations of late-stage larvae or metamorphs are evidence of successful reproduction, which is an important indicator of the viability of the population. The goal of this guide is to help students, natural resources personnel, and biologists identify eggs and larval stages of amphibians in the field without the aid of a microscope.
ESO/ST-ECF Data Analysis Workshop, 5th, Garching, Germany, Apr. 26, 27, 1993, Proceedings
NASA Astrophysics Data System (ADS)
Grosbol, Preben; de Ruijsscher, Resy
1993-01-01
Various papers on astronomical data analysis are presented. Individual topics addressed include: surface photometry of early-type galaxies, wavelet transform and adaptive filtering, package for surface photometry of galaxies, calibration of large-field mosaics, surface photometry of galaxies with HST, wavefront-supported image deconvolution, seeing effects on elliptical galaxies, multiple algorithms deconvolution program, enhancement of Skylab X-ray images, MIDAS procedures for the image analysis of E-S0 galaxies, photometric data reductions under MIDAS, crowded field photometry with deconvolved images, the DENIS Deep Near Infrared Survey. Also discussed are: analysis of astronomical time series, detection of low-amplitude stellar pulsations, new SOT method for frequency analysis, chaotic attractor reconstruction and applications to variable stars, reconstructing a 1D signal from irregular samples, automatic analysis for time series with large gaps, prospects for content-based image retrieval, redshift survey in the South Galactic Pole Region.
National Household Education Surveys of 2003: Data File User's Manual, Volume I. NCES 2004-001
ERIC Educational Resources Information Center
Hagedorn, Mary; Montaquila, Jill; Vaden-Kiernan, Nancy; Kim, Kwang; Chapman, Christopher
2004-01-01
This manual describes the development of the surveys fielded in 2003 under the National Household Education Surveys Program (NHES: 2003). It describes how the questionnaires were designed, how the samples were developed, data collection experiences, and file information needed to analyze the NHES: 2003 data sets. The surveys fielded as part of…
The Forum State of the Field Survey, 2008
ERIC Educational Resources Information Center
Kreutzer, Kim; Blessing, Charlotte; Rayner, Elise
2009-01-01
This paper presents the results from the Forum on Education Abroad's 2008 State of the Field Survey. The Survey provides information on the funding, cost and value of education abroad that will be useful to incorporate into strategic planning. While the Survey shows that there is concern about the rising costs of and relative lack of funding for…
Development of Vertical Cable Seismic System (2)
NASA Astrophysics Data System (ADS)
Asakawa, E.; Murakami, F.; Tsukahara, H.; Ishikawa, K.
2012-12-01
The vertical cable seismic is one of the reflection seismic methods. It uses hydrophone arrays vertically moored from the seafloor to record acoustic waves generated by surface, deep-towed or ocean-bottom sources. By analyzing the reflections from the sub-seabed, we can look into the subsurface structure. This type of survey is generally called VCS (Vertical Cable Seismic). Because VCS is an efficient high-resolution 3D seismic survey method for a spatially-bounded area, we proposed the method for the hydrothermal deposit survey tool development program that the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started in 2009. We are now developing a VCS system, including not only data acquisition hardware but also data processing and analysis techniques. Our first VCS survey was carried out in Lake Biwa, Japan, in November 2009 as a feasibility study. Prestack depth migration was applied to the 3D VCS data to obtain a high-quality 3D depth volume. Based on the results from the feasibility study, we developed two autonomous recording VCS systems. After a trial experiment in the open ocean at a water depth of about 400 m, we carried out the second VCS survey at Iheya Knoll with a deep-towed source. In this survey, we established procedures for the deployment and recovery of the system and examined the locations and fluctuations of the vertical cables at a water depth of around 1000 m. The acquired VCS data clearly show reflections from the sub-seafloor. Through the experiment, we confirmed that our VCS system works well even in the severe conditions around seafloor hydrothermal deposits. We carried out two field surveys in 2011: one a 3D survey with a boomer as a high-resolution surface source, and the other an actual field survey in the Izena Cauldron, an active hydrothermal area in the Okinawa Trough.
Through these surveys, we have confirmed that uncertainty in the locations of the source and of the hydrophones in the water can lower the quality of the subsurface image. It is therefore necessary to develop a total survey system that assures accurate positioning and reliable deployment techniques. When shooting at the sea surface, GPS navigation is available, but for a deep-towed or ocean-bottom source the accuracy of shot positioning with SSBL/USBL is not sufficient for the very high-resolution imaging required for SMS surveys. We will therefore incorporate an accurate LBL navigation system with VCS. The LBL navigation system has been developed by the IIS of the University of Tokyo; its error is estimated to be less than 10 cm at a water depth of 3000 m. Another approach is to calculate the shot points from the first breaks of the VCS records once the VCS locations have been estimated by slant ranging from the sea surface. Our VCS system has been designed as a survey tool for hydrothermal deposits, but it will also be applicable to deep-water site surveys or geohazard assessments such as active-fault studies.
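The slant-ranging idea mentioned above, estimating an in-water receiver position from acoustic ranges to known sea-surface points, amounts to a nonlinear least-squares trilateration. A minimal Gauss-Newton sketch is shown below; the `trilaterate` helper and its interface are hypothetical illustrations, not the authors' actual positioning system.

```python
import numpy as np

def trilaterate(surface_pts, ranges, guess, iters=20):
    """Least-squares estimate of a receiver position from slant ranges
    to known surface points, via Gauss-Newton iteration."""
    x = np.asarray(guess, dtype=float)
    for _ in range(iters):
        diffs = x - surface_pts               # (n, 3) vectors to each surface point
        dists = np.linalg.norm(diffs, axis=1)  # predicted slant ranges
        J = diffs / dists[:, None]             # Jacobian of range w.r.t. position
        r = ranges - dists                     # range residuals
        dx, *_ = np.linalg.lstsq(J, r, rcond=None)
        x += dx
    return x
```

Because all reference points lie at the surface, the depth solution has a mirror ambiguity; starting the iteration from a guess below the surface selects the physical branch.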
A survey of enabling technologies in synthetic biology
2013-01-01
Background Realizing constructive applications of synthetic biology requires continued development of enabling technologies as well as policies and practices to ensure these technologies remain accessible for research. Broadly defined, enabling technologies for synthetic biology include any reagent or method that, alone or in combination with associated technologies, provides the means to generate any new research tool or application. Because applications of synthetic biology likely will embody multiple patented inventions, it will be important to create structures for managing intellectual property rights that best promote continued innovation. Monitoring the enabling technologies of synthetic biology will facilitate the systematic investigation of property rights coupled to these technologies and help shape policies and practices that impact the use, regulation, patenting, and licensing of these technologies. Results We conducted a survey among a self-identifying community of practitioners engaged in synthetic biology research to obtain their opinions and experiences with technologies that support the engineering of biological systems. Technologies widely used and considered enabling by survey participants included public and private registries of biological parts, standard methods for physical assembly of DNA constructs, genomic databases, software tools for search, alignment, analysis, and editing of DNA sequences, and commercial services for DNA synthesis and sequencing. Standards and methods supporting measurement, functional composition, and data exchange were less widely used though still considered enabling by a subset of survey participants. Conclusions The set of enabling technologies compiled from this survey provide insight into the many and varied technologies that support innovation in synthetic biology. 
Many of these technologies are widely accessible for use, either by virtue of being in the public domain or through legal tools such as non-exclusive licensing. Access to some patent protected technologies is less clear and use of these technologies may be subject to restrictions imposed by material transfer agreements or other contract terms. We expect the technologies considered enabling for synthetic biology to change as the field advances. By monitoring the enabling technologies of synthetic biology and addressing the policies and practices that impact their development and use, our hope is that the field will be better able to realize its full potential. PMID:23663447
Mariano, John; Grauch, V.J.
1988-01-01
Aeromagnetic anomalies are produced by variations in the strength and direction of the magnetic field of rocks that include magnetic minerals, commonly magnetite. Patterns of anomalies on aeromagnetic maps can reveal structures - for example, faults which have juxtaposed magnetic rocks against non-magnetic rocks, or areas of alteration where magnetic minerals have been destroyed by hydrothermal activity. Tectonic features of regional extent may not become apparent until a number of aeromagnetic surveys have been compiled and plotted at the same scale. Commonly the compilation involves piecing together data from surveys that were flown at different times with widely disparate flight specifications and data reduction procedures. The data may be compiled into a composite map, where all the pieces are plotted onto one map without regard to the differences in flight elevation and datum, or they may be compiled into a merged map, where all survey data are analytically reduced to a common flight elevation and datum, and then digitally merged at the survey boundaries. The composite map retains the original resolution of all the survey data, but computer methods to enhance regional features crossing the survey boundaries cannot be applied. On the other hand, computer methods can be applied to the merged data, but the accuracy of the data may be slightly diminished.
L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne
2018-01-01
Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. 
Random digit dialing of mobile phones offers promise for future data collection in Ghana and may be suitable for other developing countries.
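The response, cooperation, refusal, and contact rates reported above are ratios of final call dispositions. A minimal sketch of simplified AAPOR-style formulas follows; the `aapor_rates` helper and the particular rate variants (RR2, COOP2, REF1, CON1) are illustrative assumptions, since the abstract does not specify which AAPOR definitions the authors computed.

```python
def aapor_rates(I, P, R, NC, O, UN=0, e=1.0):
    """Simplified AAPOR-style outcome rates.
    I: complete interviews, P: partials, R: refusals, NC: non-contacts,
    O: other eligible nonresponse, UN: unknown eligibility,
    e: estimated fraction of unknown-eligibility cases that are eligible."""
    eligible = I + P + R + NC + O + e * UN
    response = (I + P) / eligible              # Response Rate 2 (counts partials)
    coop = (I + P) / (I + P + R + O)           # Cooperation Rate 2
    refusal = R / eligible                     # Refusal Rate 1
    contact = (I + P + R + O) / eligible       # Contact Rate 1
    return response, coop, refusal, contact
```

For example, 60 completes, 10 partials, 10 refusals, 15 non-contacts, and 5 other nonrespondents give a 70% response rate and an 85% contact rate.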
Baldissera, Sandro; Ferrante, Gianluigi; Quarchioni, Elisa; Minardi, Valentina; Possenti, Valentina; Carrozzi, Giuliano; Masocco, Maria; Salmaso, Stefania
2014-04-01
Field substitution of nonrespondents can be used to maintain the planned sample size and structure in surveys but may introduce additional bias. Sample weighting is suggested as the preferable alternative; however, limited empirical evidence exists comparing the two methods. We wanted to assess the impact of substitution on surveillance results using data from Progressi delle Aziende Sanitarie per la Salute in Italia-Progress by Local Health Units towards a Healthier Italy (PASSI). PASSI is conducted by Local Health Units (LHUs) through telephone interviews of stratified random samples of residents. Nonrespondents are replaced with substitutes randomly preselected in the same LHU stratum. We compared the weighted estimates obtained in the original PASSI sample (used as a reference) and in the substitutes' sample. The differences were evaluated using a Wald test. In 2011, 50,697 units were selected: 37,252 were from the original sample and 13,445 were substitutes; 37,162 persons were interviewed. The initially planned size and demographic composition were restored. No significant differences in the estimates between the original and the substitutes' sample were found. In our experience, field substitution is an acceptable method for dealing with nonresponse, maintaining the characteristics of the original sample without affecting the results. This evidence can support appropriate decisions about planning and implementing a surveillance system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cecil, L.D.; Knobel, L.L.; Wegner, S.J.
1989-09-01
From 1952 to 1988, about 140 curies of strontium-90 have been discharged in liquid waste to disposal ponds and wells at the INEL (Idaho National Engineering Laboratory). The US Geological Survey routinely samples ground water from the Snake River Plain aquifer and from discontinuous perched-water zones for selected radionuclides, major and minor ions, and chemical and physical characteristics. Water samples for strontium-90 analyses collected in the field are unfiltered and preserved to an approximate 2-percent solution with reagent-grade hydrochloric acid. Water from four wells completed in the Snake River Plain aquifer was sampled as part of the US Geological Survey's quality-assurance program to evaluate the effect of filtration and preservation methods on strontium-90 concentrations in ground water at the INEL. The wells were selected for sampling on the basis of historical concentrations of strontium-90 in ground water. Water from each well was filtered through either a 0.45- or a 0.1-micrometer membrane filter; unfiltered samples also were collected. Two sets of filtered and two sets of unfiltered water samples were collected at each well. One set of water samples was preserved in the field to an approximate 2-percent solution with reagent-grade hydrochloric acid and the other set of samples was not acidified. 13 refs., 2 figs., 6 tabs.
Bayesian power spectrum inference with foreground and target contamination treatment
NASA Astrophysics Data System (ADS)
Jasche, J.; Lavaux, G.
2017-10-01
This work presents a joint and self-consistent Bayesian treatment of various foreground and target contaminations when inferring cosmological power spectra and three-dimensional density fields from galaxy redshift surveys. This is achieved by introducing additional block-sampling procedures for unknown coefficients of foreground and target contamination templates to the previously presented ARES framework for Bayesian large-scale structure analyses. As a result, the method infers jointly and fully self-consistently three-dimensional density fields, cosmological power spectra, luminosity-dependent galaxy biases, noise levels of the respective galaxy distributions, and coefficients for a set of a priori specified foreground templates. In addition, this fully Bayesian approach permits detailed quantification of correlated uncertainties amongst all inferred quantities and correctly marginalizes over observational systematic effects. We demonstrate the validity and efficiency of our approach in obtaining unbiased estimates of power spectra via applications to realistic mock galaxy observations that are subject to stellar contamination and dust extinction. While simultaneously accounting for galaxy biases and unknown noise levels, our method reliably and robustly infers three-dimensional density fields and corresponding cosmological power spectra from deep galaxy surveys. Furthermore, our approach correctly accounts for joint and correlated uncertainties between unknown coefficients of foreground templates and the amplitudes of the power spectrum. This effect amounts to correlations and anti-correlations of up to 10 per cent across wide ranges in Fourier space.
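The block-sampling of foreground-template coefficients described above can be illustrated with a toy conditional (Gibbs) step: in a linear model d = s + a·f + n with white noise and a flat prior on a, the conditional posterior of the template amplitude a given the signal s is Gaussian. The `sample_template_coeff` helper below is a deliberately simplified sketch under these assumptions, not the ARES implementation.

```python
import numpy as np

def sample_template_coeff(d, s, f, noise_var, rng):
    """Draw the template amplitude a from its conditional posterior in
    d = s + a*f + n, with n ~ N(0, noise_var * I) and a flat prior on a."""
    prec = f @ f / noise_var                    # posterior precision of a
    mean = (f @ (d - s) / noise_var) / prec     # posterior mean of a
    return mean + rng.standard_normal() / np.sqrt(prec)
```

In a full block sampler this draw alternates with conditional draws of the density field and power spectrum, so the correlations between template amplitudes and the other quantities are propagated automatically.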
Untangling Galaxy Components - The Angular Momentum Parameter
NASA Astrophysics Data System (ADS)
Tabor, Martha; Merrifield, Michael; Aragon-Salamanca, Alfonso
2017-06-01
We have developed a new technique to decompose Integral Field spectral data cubes into separate bulge and disk components, allowing us to study the kinematic and stellar population properties of the individual components and how they vary with position. We present here the application of this method to a sample of fast rotator early type galaxies from the MaNGA integral field survey, and demonstrate how it can be used to explore key properties of the individual components. By extracting ages, metallicities and the angular momentum parameter λ of the bulges and disks, we show how this method can give us new insights into the underlying structure of the galaxies and discuss what this can tell us about their evolution history.
NASA Astrophysics Data System (ADS)
Kasprak, A.; Wheaton, J. M.; Bouwes, N.; Weber, N. P.; Trahan, N. C.; Jordan, C. E.
2012-12-01
River managers often seek to understand habitat availability and quality for riverine organisms within the physical template provided by their landscape. Yet the large amount of natural heterogeneity in landscapes gives rise to stream systems which are highly variable over small spatial scales, potentially complicating site selection for surveying aquatic habitat while simultaneously making a simple, wide-reaching management strategy elusive. This is particularly true in the rugged John Day River Basin of northern Oregon, where efforts as part of the Columbia Habitat Monitoring Program to conduct site-based surveys of physical habitat for endangered steelhead (Oncorhynchus mykiss) are underway. As a complete understanding of the type and distribution of habitat available to these fish would require visits to all streams in the basin (impractical due to its large size), here we develop an approach for classifying channel types which combines remote desktop GIS analyses with rapid field-based stream and landscape surveys. At the core of this method, we build off of the River Styles Framework, an open-ended and process-based approach for classifying streams and informing management decisions. This framework is combined with on-the-ground fluvial audits, which aim to quickly and continuously map sediment dynamics and channel behavior along selected channels. Validation of this classification method is completed by on-the-ground stream surveys using a digital iPad platform and by rapid small aircraft overflights to confirm or refine predictions. We further compare this method with existing channel classification approaches for the region (e.g. Beechie, Montgomery and Buffington).
The results of this study will help guide both the refinement of site stratification and selection for salmonid habitat monitoring within the basin, and will be vital in designing and prioritizing restoration and management strategies tailored to the distribution of river styles found across the region.
Uncertainties estimation in surveying measurands: application to lengths, perimeters and areas
NASA Astrophysics Data System (ADS)
Covián, E.; Puente, V.; Casero, M.
2017-10-01
The present paper develops a series of methods for the estimation of uncertainty when measuring certain measurands of interest in surveying practice, such as the elevation of a point at a given planimetric position within a triangle mesh, 2D and 3D lengths (including perimeters of enclosures), 2D areas (horizontal surfaces) and 3D areas (natural surfaces). The basis for the proposed methodology is the law of propagation of variance-covariance, which, applied to the corresponding model for each measurand, allows calculating the resulting uncertainty from known measurement errors. The methods are tested first in a small example, with a limited number of measurement points, and then in two real-life measurements. In addition, the proposed methods have been incorporated into commercial software used in the field of surveying engineering and focused on the creation of digital terrain models. The aim of this evolution is, firstly, to comply with the guidelines of the BIPM (Bureau International des Poids et Mesures), as the international reference agency in the field of metrology, in relation to the determination and expression of uncertainty; and secondly, to improve the quality of the measurement by indicating the uncertainty associated with a given level of confidence. The conceptual and mathematical developments for the uncertainty estimation in the aforementioned cases were conducted by researchers from the AssIST group at the University of Oviedo, eventually resulting in several different mathematical algorithms implemented in the form of MATLAB code. Based on these prototypes, technicians incorporated the referred functionality to commercial software, developed in C++.
As a result of this collaboration, in early 2016 a new version of this commercial software was made available, which will be the first, as far as the authors are aware, that incorporates the possibility of estimating the uncertainty for a given level of confidence when computing the aforementioned surveying measurands.
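The law of propagation of variance-covariance applied above can be illustrated with a toy example: first-order propagation var(f) ≈ J Σ Jᵀ applied to the area of a triangle computed from six measured coordinates. This is a generic sketch with a numerically estimated Jacobian, not the AssIST group's MATLAB or C++ implementation.

```python
import numpy as np

def tri_area(p):
    """Area of a triangle from a flat parameter vector (x1,y1,x2,y2,x3,y3)."""
    x1, y1, x2, y2, x3, y3 = p
    return 0.5 * abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))

def propagate_variance(f, p, cov, h=1e-6):
    """First-order variance propagation: var(f) ~ J @ cov @ J.T,
    with the Jacobian J of f estimated by central differences."""
    p = np.asarray(p, dtype=float)
    J = np.empty(p.size)
    for i in range(p.size):
        d = np.zeros_like(p)
        d[i] = h
        J[i] = (f(p + d) - f(p - d)) / (2 * h)
    return float(J @ cov @ J)
```

With coordinate standard deviations of 1 cm on a 3-4-5 m triangle, the propagated area variance follows directly from the partial derivatives of the area with respect to each coordinate.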
Developing Gyrfalcon surveys and monitoring for Alaska
Fuller, Mark R.; Schempf, Philip F.; Booms, Travis L.
2011-01-01
We developed methods to monitor the status of Gyrfalcons in Alaska. Results of surveys and monitoring will be informative for resource managers and will be useful for studying potential changes in ecological communities of the high latitudes. We estimated that the probability of detecting a Gyrfalcon at an occupied nest site was between 64% and 87% depending on observer experience and aircraft type (fixed-wing or helicopter). The probability of detection is an important factor for estimating occupancy of nesting areas, and occupancy can be used as a metric for monitoring species' status. We conclude that surveys of nesting habitat to monitor occupancy during the breeding season are practical because of the high probability of seeing a Gyrfalcon from aircraft. Aerial surveys are effective for searching sample plots or index areas in the expanse of the Alaskan terrain. Furthermore, several species of cliff-nesting birds can be surveyed concurrently from aircraft. Occupancy estimation also can be applied using data from other field search methods (e.g., from boats) that have proven useful in Alaska. We believe a coordinated broad-scale, inter-agency, collaborative approach is necessary in Alaska. Monitoring can be facilitated by collating and archiving each set of results in a secure universal repository to allow for statewide meta-analysis.
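The link between per-visit detection probability and occupancy estimation sketched above can be made concrete: with per-visit detection probability p, the chance of at least one detection in k visits is 1 − (1 − p)^k, and the raw proportion of sites with detections can be corrected upward accordingly. The `naive_occupancy` moment estimator below is a deliberately simplified stand-in for the formal occupancy models the authors refer to, with illustrative numbers only.

```python
def p_detect_at_least_once(p, k):
    """Probability of at least one detection in k independent visits,
    given per-visit detection probability p."""
    return 1 - (1 - p) ** k

def naive_occupancy(sites_with_detection, n_sites, p, k):
    """Correct the raw proportion of sites with detections for
    imperfect detection (simple moment estimator)."""
    p_star = p_detect_at_least_once(p, k)
    return (sites_with_detection / n_sites) / p_star
```

For example, at the lower reported detection probability of 0.64, two visits per site already raise the cumulative detection probability above 0.87.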
Measuring Extinction in Local Group Galaxies Using Background Galaxies
NASA Astrophysics Data System (ADS)
Wyder, T. K.; Hodge, P. W.
1999-05-01
Knowledge of the distribution and quantity of dust in galaxies is important for understanding their structure and evolution. The goal of our research is to measure the total extinction through Local Group galaxies using measured properties of background galaxies. Our method relies on the SExtractor software as an objective and automated method of detecting background galaxies. In an initial test, we have explored two WFPC2 fields in the SMC and two in M31 obtained from the HST archives. The two pointings in the SMC are fields around the open clusters L31 and B83 while the two M31 fields target the globular clusters G1 and G170. Except for the G1 observations of M31, the fields chosen are very crowded (even when observed with HST) and we chose them as a particularly stringent test of the method. We performed several experiments using a series of completeness tests that involved superimposing comparison fields, adjusted to the equivalent exposure time, from the HST Medium-Deep and Groth-Westphal surveys. These tests showed that for crowded fields, such as the two in the core of the SMC and the one in the bulge of M31, this automated method of detecting galaxies can be completely dominated by the effects of crowding. For these fields, only a small fraction of the added galaxies was recovered. However, in the outlying G1 field in M31, almost all of the added galaxies were recovered. The numbers of actual background galaxies in this field are consistent with zero extinction. As a follow-up experiment, we used image processing techniques to suppress stellar objects while enhancing objects with non-stellar, more gradual luminosity profiles. This method yielded significant numbers of background galaxies in even the most crowded fields, which we are now analyzing to determine the total extinction and reddening caused by the foreground galaxy.
The development of a survey instrument for community health improvement.
Bazos, D A; Weeks, W B; Fisher, E S; DeBlois, H A; Hamilton, E; Young, M J
2001-01-01
OBJECTIVE: To develop a survey instrument that could be used both to guide and evaluate community health improvement efforts. DATA SOURCES/STUDY SETTING: A randomized telephone survey was administered to a sample of about 250 residents in two communities in Lehigh Valley, Pennsylvania in the fall of 1997. METHODS: The survey instrument was developed by health professionals representing diverse health care organizations. This group worked collaboratively over a period of two years to (1) select a conceptual model of health as a foundation for the survey; (2) review relevant literature to identify indicators that adequately measured the health constructs within the chosen model; (3) develop new indicators where important constructs lacked specific measures; and (4) pilot test the final survey to assess the reliability and validity of the instrument. PRINCIPAL FINDINGS: The Evans and Stoddart Field Model of the Determinants of Health and Well-Being was chosen as the conceptual model within which to develop the survey. The Field Model depicts nine domains important to the origins and production of health and provides a comprehensive framework from which to launch community health improvement efforts. From more than 500 potential indicators we identified 118 survey questions that reflected the multiple determinants of health as conceptualized by this model. Sources from which indicators were selected include the Behavior Risk Factor Surveillance Survey, the National Health Interview Survey, the Consumer Assessment of Health Plans Survey, and the SF-12 Summary Scales. The work group developed 27 new survey questions for constructs for which we could not locate adequate indicators. Twenty-five questions in the final instrument can be compared to nationally published norms or benchmarks. The final instrument was pilot tested in 1997 in two communities. Administration time averaged 22 minutes with a response rate of 66 percent. 
Reliability of new survey questions was adequate. Face validity was supported by previous findings from qualitative and quantitative studies. CONCLUSIONS: We developed, pilot tested, and validated a survey instrument designed to provide more comprehensive and timely data to communities for community health assessments. This instrument allows communities to identify and measure critical domains of health that have previously not been captured in a single instrument. PMID:11508639
New Observational Constraints to Milky Way Chemodynamical Models
NASA Astrophysics Data System (ADS)
Chiappini, Cristina; Minchev, Ivan; Anders, Friedrich; Brauer, Dorothee; Boeche, Corrado; Martig, Marie
Galactic Archaeology, i.e. the use of chemo-dynamical information for stellar samples covering large portions of the Milky Way to infer the dominant processes involved in its formation and evolution, is now a powerful method thanks to the large recently completed and ongoing spectroscopic surveys. It is now important to ask the right questions when analyzing and interpreting the information contained in these rich datasets. To this aim, we have developed a chemodynamical model for the Milky Way that provides quantitative predictions to be compared with the chemo-kinematical properties extracted from the stellar spectra. Three key parameters are needed to make the comparison between data and model predictions useful in order to advance in the field, namely: precise proper motions, distances and ages. The uncertainties involved in the estimates of ages and distances for field stars are currently the main obstacles in the Galactic Archaeology method. Two important developments might change this situation in the near future: asteroseismology and the now launched Gaia. When combined with the large datasets from surveys like RAVE, SEGUE, LAMOST, Gaia-ESO, APOGEE, HERMES and the future 4MOST, we will have the basic ingredients for the reconstruction of the MW history in hand. In the light of these observational advances, the development of detailed chemo-dynamical models tailored to the Milky Way is urgently needed in the field. Here we show the steps we have taken, both in terms of data analysis and modelling. The examples shown here illustrate how powerful the Galactic Archaeology method can become once ages and distances are known with better precision than is currently feasible.
Women's mental health research: the emergence of a biomedical field.
Blehar, Mary C
2006-01-01
This review surveys the field of women's mental health, with particular emphasis on its evolution into a distinct area of biomedical research. The field employs a biomedical disease model but it also emphasizes social and cultural influences on health outcomes. In recent years, its scope has expanded beyond studies of disorders occurring in women at times of reproductive transitions and it now encompasses a broader study of sex and gender differences. Historical and conceptual influences on the field are discussed. The review also surveys gender differences in the prevalence and clinical manifestations of mental disorders. Epidemiological findings have provided a rich resource for theory development, but without research tools to test theories adequately, findings of gender differences have begged the question of their biological, social, and cultural origins. Clinical depression is used to exemplify the usefulness of a sex/gender perspective in understanding mental illness; and major theories proposed to account for gender differences are critically evaluated. The National Institutes of Health (NIH) is the primary federal funding source for biomedical women's mental health research. The review surveys areas of emphasis in women's mental health research at the NIH as well as some collaborative activities that represent efforts to translate research findings into the public health and services arenas. As new analytic methods become available, it is anticipated that a more fundamental understanding of the biological and behavioral mechanisms underlying sex and gender differences in mental illness will emerge. Nonetheless, it is also likely that integration of findings predicated on different conceptual models of the nature and causes of mental illness will remain a challenge. These issues are discussed with reference to their impact on the field of women's mental health research.
Calibration of HST wide field camera for quantitative analysis of faint galaxy images
NASA Technical Reports Server (NTRS)
Ratnatunga, Kavan U.; Griffiths, Richard E.; Casertano, Stefano; Neuschaefer, Lyman W.; Wyckoff, Eric W.
1994-01-01
We present the methods adopted to optimize the calibration of images obtained with the Hubble Space Telescope (HST) Wide Field Camera (WFC) (1991-1993). Our main goal is to improve quantitative measurement of faint images, with special emphasis on the faint (I approximately 20-24 mag) stars and galaxies observed as a part of the Medium-Deep Survey. Several modifications to the standard calibration procedures have been introduced, including improved bias and dark images, and a new supersky flatfield obtained by combining a large number of relatively object-free Medium-Deep Survey exposures of random fields. The supersky flat has a pixel-to-pixel rms error of about 2.0% in F555W and of 2.4% in F785LP; large-scale variations are smaller than 1% rms. Overall, our modifications improve the quality of faint images with respect to the standard calibration by about a factor of five in photometric accuracy and about 0.3 mag in sensitivity, corresponding to about a factor of two in observing time. The relevant calibration images have been made available to the scientific community.
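The "supersky" flat described above, built by combining many object-containing exposures of different fields, is commonly implemented as a sigma-clipped median stack: astronomical sources fall out of the per-pixel statistics and only the detector response survives. A minimal sketch on synthetic data (my own simplification, not the Medium-Deep Survey pipeline; function and variable names are hypothetical):

```python
import numpy as np

def supersky_flat(frames, clip_sigma=3.0):
    """Combine many exposures of *different* fields into a flatfield:
    sigma-clip each pixel's stack so stars/galaxies are rejected, then
    median-combine what survives and normalize to unit median."""
    stack = np.stack(frames).astype(float)
    med = np.median(stack, axis=0)
    std = stack.std(axis=0) + 1e-12            # avoid divide-by-zero
    clipped = np.where(np.abs(stack - med) > clip_sigma * std, np.nan, stack)
    flat = np.nanmedian(clipped, axis=0)
    return flat / np.median(flat)

# Toy demo: a smooth pixel response, a flat sky level, and one bright
# fake "object" dropped into each frame at a random position.
rng = np.random.default_rng(1)
response = 1.0 + 0.05 * np.sin(np.linspace(0.0, np.pi, 32))[:, None] * np.ones(32)
frames = []
for _ in range(20):
    frame = 100.0 * response + rng.normal(0.0, 1.0, size=(32, 32))
    r, c = rng.integers(0, 32, size=2)
    frame[r, c] += 500.0                       # fake star/galaxy
    frames.append(frame)
flat = supersky_flat(frames)
```

In practice each exposure would first be bias/dark-corrected and scaled to a common sky level before stacking; the sketch omits those steps.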
Wagner, Richard J.; Boulger, Robert W.; Oblinger, Carolyn J.; Smith, Brett A.
2006-01-01
The U.S. Geological Survey uses continuous water-quality monitors to assess the quality of the Nation's surface water. A common monitoring-system configuration for water-quality data collection is the four-parameter monitoring system, which collects temperature, specific conductance, dissolved oxygen, and pH data. Such systems also can be configured to measure other properties, such as turbidity or fluorescence. Data from sensors can be used in conjunction with chemical analyses of samples to estimate chemical loads. The sensors that are used to measure water-quality field parameters require careful field observation, cleaning, and calibration procedures, as well as thorough procedures for the computation and publication of final records. This report provides guidelines for site- and monitor-selection considerations; sensor inspection and calibration methods; field procedures; data evaluation, correction, and computation; and record-review and data-reporting processes, which supersede the guidelines presented previously in U.S. Geological Survey Water-Resources Investigations Report WRIR 00-4252. These procedures have evolved over the past three decades, and the process continues to evolve with newer technologies.
23 CFR 668.111 - Application procedures.
Code of Federal Regulations, 2010 CFR
2010-04-01
... intent to apply for ER funds. (b) Damage survey. As soon as practical after occurrence, the State will make a preliminary field survey, working cooperatively with the FHWA Division Administrator and other governmental agencies with jurisdiction over eligible highways. The preliminary field survey should be...
Assessing safety awareness and knowledge and behavioral change among West Virginia loggers
Helmkamp, J; Bell, J; Lundstrom, W; Ramprasad, J; Haque, A
2004-01-01
Objective: To determine if a video used during logger training influences safety attitude, knowledge, and workplace habits. Method: From April 2002 to October 2003, loggers receiving training through the West Virginia Division of Forestry were given a new safety module. This consisted of a pre-training survey, viewing the video, a brief introduction to the field safety guide, and an immediate post-training survey. Six months after training, loggers were contacted by telephone to assess workplace behavioral changes. Results: 1197 loggers attended 80 training sessions and completed surveys; 21% were contacted at follow-up. Pre-training surveys indicated that half said "accidents" were part of the job and had experienced a "close call" in their work. An overwhelming majority felt that safety management and periodic meetings were important. Over 75% indicated they would not take risks in order to make a profit. Several statistically significant improvements were noted in safety knowledge after viewing the video: the logger's location in relation to the tree stump during fatal incidents, and the pictorial identification of an overloaded truck and the safest cutting notch. At follow-up, many of the loggers said they related to the real-life victim stories portrayed in the video. Further, the field guide served as a quick and easy reference and taught them valuable tips on safe cutting and felling. Conclusions: Significant changes in safety knowledge and attitude among certified loggers resulted from viewing the video during training. Subsequent use of the video and field guide at the worksite encouraged positive change in self-reported work habits and practices. PMID:15314051
The influence of television and film on interest in space and science
NASA Astrophysics Data System (ADS)
Jackson, Katrina Marie
Entertainment media has great potential to inspire interest in the topics it presents. The purpose of this study is to better understand how entertainment media contributes to people's interests in space and science. The previous literature covers a huge variety of science communication topics, some of which deal with television and film, but very little of it specifically studies how television and film can inspire interest. A historical review of pioneers in the space industry shows that many were inspired by entertainment media, which at the time consisted of science fiction novels and magazines. In order to explore the possible relationships among influences for scientists and non-scientists and to determine specific questions for future research, I created and distributed an anonymous, online survey. The survey is suggestive, exploratory research using a convenience sampling method and is not meant to provide scientifically accurate statistics. 251 participants completed the survey; 196 were scientists and 55 were non-scientists. The survey showed that the participants did identify entertainment media as a major influencing factor, on a comparable level with factors such as classes or family members. Participants in space-related fields were influenced by entertainment media more than the participants in other fields were. I identified several questions for future research, such as: Are people in space-related fields inspired by entertainment media more than other scientists are? Are non-space-related scientists often inspired by space-related media? Do people who regularly watch science fiction tend to be more scientifically literate than average?
Probing Neutrino Hierarchy and Chirality via Wakes.
Zhu, Hong-Ming; Pen, Ue-Li; Chen, Xuelei; Inman, Derek
2016-04-08
The relic neutrinos are expected to acquire a bulk relative velocity with respect to the dark matter at low redshifts, and neutrino wakes are expected to develop downstream of the dark matter halos. We propose a method of measuring the neutrino mass based on this mechanism. This neutrino wake will cause a dipole distortion of the galaxy-galaxy lensing pattern. This effect could be detected by combining upcoming lensing surveys with a low redshift galaxy survey or a 21 cm intensity mapping survey, which can map the neutrino flow field. The data obtained with LSST and Euclid should enable us to make a positive detection if the three neutrino masses are quasidegenerate with each neutrino mass of ∼0.1 eV, and a future high precision 21 cm lensing survey would allow the normal hierarchy and inverted hierarchy cases to be distinguished, and even the right-handed Dirac neutrinos may be detectable.
An evaluation of EREP (Skylab) and ERTS imagery for integrated natural resources survey
NASA Technical Reports Server (NTRS)
Vangenderen, J. L. (Principal Investigator)
1973-01-01
The author has identified the following significant results. An experimental procedure has been devised and is being tested for natural resource surveys to cope with the problems of interpreting and processing the large quantities of data provided by Skylab and ERTS. Some basic aspects of orbital imagery such as scale, the role of repetitive coverage, and types of sensors are being examined in relation to integrated surveys of natural resources and regional development planning. Extrapolation away from known ground conditions, a fundamental technique for mapping resources, becomes very effective when used on orbital imagery supported by field mapping. Meaningful boundary delimitations can be made on orbital images using various image enhancement techniques. To meet the needs of many developing countries, this investigation into the use of satellite imagery for integrated resource surveys involves the analysis of the images by means of standard visual photointerpretation methods.
Aryal, Arjun; Brooks, Benjamin A.; Reid, Mark E.; Bawden, Gerald W.; Pawlak, Geno
2012-01-01
Acquiring spatially continuous ground-surface displacement fields from Terrestrial Laser Scanners (TLS) will allow better understanding of the physical processes governing landslide motion at detailed spatial and temporal scales. Problems arise, however, when estimating continuous displacement fields from TLS point-clouds because reflecting points from sequential scans of moving ground are not defined uniquely, thus repeat TLS surveys typically do not track individual reflectors. Here, we implemented the cross-correlation-based Particle Image Velocimetry (PIV) method to derive a surface deformation field using TLS point-cloud data. We estimated associated errors using the shape of the cross-correlation function and tested the method's performance with synthetic displacements applied to a TLS point cloud. We applied the method to the toe of the episodically active Cleveland Corral Landslide in northern California using TLS data acquired in June 2005–January 2007 and January–May 2010. Estimated displacements ranged from decimeters to several meters and they agreed well with independent measurements at better than 9% root mean squared (RMS) error. For each of the time periods, the method provided a smooth, nearly continuous displacement field that coincides with independently mapped boundaries of the slide and permits further kinematic and mechanical inference. For the 2010 data set, for instance, the PIV-derived displacement field identified a diffuse zone of displacement that preceded by over a month the development of a new lateral shear zone. Additionally, the upslope and downslope displacement gradients delineated by the dense PIV field elucidated the non-rigid behavior of the slide.
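The cross-correlation step at the heart of the PIV method described above can be sketched on synthetic gridded data: the peak of the cross-correlation between two rasterized patches gives the displacement between scans. This is a minimal illustration, not the authors' implementation (function and variable names are mine), and real PIV adds subpixel peak fitting:

```python
import numpy as np
from scipy.signal import correlate

def piv_shift(patch_t0, patch_t1):
    """Estimate the integer-pixel displacement between two gridded scans
    from the peak of their cross-correlation (the core PIV operation)."""
    a = patch_t0 - patch_t0.mean()   # remove DC so the peak is well defined
    b = patch_t1 - patch_t1.mean()
    corr = correlate(b, a, mode="same", method="fft")
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    center = np.array(corr.shape) // 2
    return tuple(np.array(peak) - center)   # (row, col) shift in pixels

# Synthetic check: displace a random surface by (3, -2) grid cells.
rng = np.random.default_rng(0)
surface = rng.normal(size=(64, 64))
moved = np.roll(surface, shift=(3, -2), axis=(0, 1))
dy, dx = piv_shift(surface, moved)
```

Tiling the scan into overlapping interrogation windows and running this per window yields the dense displacement field; the shape of the correlation peak in each window gives the per-window error estimate the abstract mentions.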
Measuring the scatter in the cluster optical richness-mass relation with machine learning
NASA Astrophysics Data System (ADS)
Boada, Steven Alvaro
The distribution of massive clusters of galaxies depends strongly on the total cosmic mass density, the mass variance, and the dark energy equation of state. As such, measures of galaxy clusters can provide constraints on these parameters and even test models of gravity, but only if observations of clusters can lead to accurate estimates of their total masses. Here, we carry out a study to investigate the ability of a blind spectroscopic survey to recover accurate galaxy cluster masses through their line-of-sight velocity dispersions (LOSVDs) using probability-based and machine learning methods. We focus on the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX), which will employ new Visible Integral-Field Replicable Unit Spectrographs (VIRUS) over 420 square degrees on the sky with a 1/4.5 fill factor. VIRUS covers the blue/optical portion of the spectrum (3500-5500 Å), allowing surveys to measure redshifts for a large sample of galaxies out to z < 0.5 based on their absorption or emission (e.g., [O II], Mg II, Ne V) features. We use a detailed mock galaxy catalog from a semi-analytic model to simulate surveys observed with VIRUS, including: (1) Survey, a blind, HETDEX-like survey with an incomplete but uniform spectroscopic selection function; and (2) Targeted, a survey which targets clusters directly, obtaining spectra of all galaxies in a VIRUS-sized field. For both surveys, we include realistic uncertainties from galaxy magnitude and line-flux limits. We benchmark both surveys against spectroscopic observations with "perfect" knowledge of galaxy line-of-sight velocities. With Survey observations, we can recover cluster masses to ~0.1 dex, which can be further improved to < 0.1 dex with Targeted observations. This level of cluster mass recovery provides important measurements of the intrinsic scatter in the optical richness-cluster mass relation, and enables constraints on the key cosmological parameter, sigma_8, to < 20%.
As a demonstration of the methods developed previously, we present a pilot survey with integral field spectroscopy of ten galaxy clusters optically selected from the Sloan Digital Sky Survey's DR8 at z = 0.2-0.3. Eight of the clusters are rich (lambda > 60) systems with total inferred masses of (1.58-17.37) × 10^14 M_sun (M200c), and two are poor (lambda < 15) systems with inferred total masses of ~0.5 × 10^14 M_sun (M200c). We use the Mitchell Spectrograph (formerly the VIRUS-P spectrograph, a prototype of the HETDEX VIRUS instrument), located on the McDonald Observatory 2.7 m telescope, to measure spectroscopic redshifts and line-of-sight velocities of the galaxies in and around each cluster, determine cluster membership, and derive LOSVDs. We test both a LOSVD-cluster mass scaling relation and a machine learning based approach to infer total cluster mass. After comparing the cluster mass estimates to the literature, we use these independent cluster mass measurements to estimate the absolute cluster mass scale and the intrinsic scatter in the optical richness-mass relationship. We measure the intrinsic scatter in richness at fixed cluster mass to be sigma_M|lambda = 0.27 +/- 0.07 dex, in excellent agreement with previous estimates of sigma_M|lambda ~ 0.2-0.3 dex. We discuss the importance of the data used to train the machine learning methods and suggest various strategies to improve the accuracy of the bias (offset) and scatter in the optical richness-cluster mass relation. This demonstrates the power of blind spectroscopic surveys such as HETDEX to provide robust cluster mass estimates, which can aid in the determination of cosmological parameters and help to calibrate the observable-mass relation for future photometric large-area sky surveys.
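The LOSVD-cluster mass scaling relation tested above is, in its simplest form, a power law between velocity dispersion and M200c. The abstract does not state which normalization and slope were used, so the sketch below borrows the dark-matter calibration of Evrard et al. (2008) purely for illustration:

```python
def mass_from_losvd(sigma_v_kms, sigma_15=1082.9, alpha=0.3361):
    """Invert sigma_v = sigma_15 * (M200c / 1e15 Msun)**alpha to get
    M200c in solar masses from a line-of-sight velocity dispersion.
    sigma_15 and alpha here are illustrative placeholder values, not
    the calibration used in the survey described above."""
    return 1e15 * (sigma_v_kms / sigma_15) ** (1.0 / alpha)
```

For example, a cluster with sigma_v of roughly 700 km/s maps to a few × 10^14 M_sun under these placeholder values; the steep inverse slope (1/alpha ≈ 3) is why LOSVD measurement errors propagate strongly into mass errors.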
NASA Astrophysics Data System (ADS)
Song, X. P.; Potapov, P.; Adusei, B.; King, L.; Khan, A.; Krylov, A.; Di Bella, C. M.; Pickens, A. H.; Stehman, S. V.; Hansen, M.
2016-12-01
Reliable and timely information on agricultural production is essential for ensuring world food security. Freely available medium-resolution satellite data (e.g. Landsat, Sentinel) offer the possibility of improved global agriculture monitoring. Here we develop and test a method for estimating in-season crop acreage using a probability sample of field visits and producing wall-to-wall crop type maps at national scales. The method is first illustrated for soybean cultivated area in the US for 2015. A stratified, two-stage cluster sampling design was used to collect field data to estimate national soybean area. The field-based estimate employed historical soybean extent maps from the U.S. Department of Agriculture (USDA) Cropland Data Layer to delineate and stratify U.S. soybean growing regions. The estimated 2015 U.S. soybean cultivated area based on the field sample was 341,000 km2 with a standard error of 23,000 km2. This result is 1.0% lower than USDA's 2015 June survey estimate and 1.9% higher than USDA's 2016 January estimate. Our area estimate was derived in early September, about 2 months ahead of harvest. To map soybean cover, the Landsat image archive for the year 2015 growing season was processed using an active learning approach. Overall accuracy of the soybean map was 84%. The field-based sample estimated area was then used to calibrate the map such that the soybean acreage of the map derived through pixel counting matched the sample-based area estimate. The strength of the sample-based area estimation lies in the stratified design that takes advantage of the spatially explicit cropland layers to construct the strata. The success of the mapping was built upon an automated system which transforms Landsat images into standardized time-series metrics. The developed method produces reliable and timely information on soybean area in a cost-effective way and could be implemented in an operational mode. 
The approach has also been applied for other crops in other regions, such as winter wheat in Pakistan, soybean in Argentina and soybean in the entire South America. Similar levels of accuracy and timeliness were achieved as in the US.
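The calibration step described above, matching the map's pixel-count area to the independent sample-based estimate, can be sketched as a threshold search over a per-pixel class-probability map. This is a hypothetical illustration with made-up numbers, not the authors' pipeline:

```python
import numpy as np

def calibrate_threshold(prob_map, target_area_km2, pixel_area_km2, thresholds):
    """Choose the classification threshold whose mapped (pixel-count)
    area best matches an independent, sample-based area estimate."""
    best_t, best_err = None, np.inf
    for t in thresholds:
        area_km2 = float((prob_map >= t).sum()) * pixel_area_km2
        err = abs(area_km2 - target_area_km2)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

# Illustrative numbers: 30 m pixels (0.0009 km2 each), a uniform-random
# "probability" map covering 900 km2, and a sample-based target of 300 km2.
rng = np.random.default_rng(42)
prob_map = rng.random((1000, 1000))
t_star = calibrate_threshold(prob_map, 300.0, 0.0009, np.linspace(0.0, 1.0, 201))
```

Because the map area shrinks monotonically as the threshold rises, the search always has a well-defined optimum; the same idea applies whether the per-pixel score comes from the active-learning classifier in the abstract or any other model.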
Collection, Processing, and Accuracy of Mobile Terrestrial Lidar Survey Data in the Coastal Environment (ERDC/CHL TR-17-5, Coastal Field Data Collection Program)
Spore, Nicholas J.; Brodie, Katherine L.
2017-04-01
Field Research Facility, U.S. Army Engineer Research and Development Center. ...value to a mobile lidar survey may misrepresent some of the spatially variable error throughout the survey, and further work should incorporate full...