Sample records for quantitative survey design

  1. MARSAME Chapter 4: Develop a Survey Design

    E-print Network

    4.1 Introduction: Once a decision rule has been developed, a disposition survey can be designed for the impacted materials and equipment (M&E) ... costly and time-consuming development of redundant survey designs. The evaluation of existing SOPs...

  2. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris (Sandia National Laboratories, Carlsbad, NM)

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  3. Adaptive Survey Design

    E-print Network

    Carriquiry, Alicia

    Nonresponse is a serious issue in sample surveys and can cause bias in survey estimates. (Presentation: Adaptive Survey Design, Andrew Sage, Iowa State University Center for Survey Statistics and Methodology, March 14, 2014.)

  4. Survey Design - How to Begin Your Survey Design Project

    NSDL National Science Digital Library

    Creative Research Systems

    This site, produced by Creative Research Systems, takes the user through the steps and decisions necessary when designing a survey. Pros and cons for each method are outlined and other issues in survey design are presented. The site does contain products which need to be purchased, such as a survey done by the corporation, but the page still contains a great deal of free resources which can be applied to conducting your own survey.

  5. Report on Solar Water Heating Quantitative Survey

    SciTech Connect

    Focus Marketing Services

    1999-05-06

    This report details the results of a quantitative research study undertaken to better understand the marketplace for solar water-heating systems from the perspective of home builders, architects, and home buyers.

  6. A Split Questionnaire Survey Design

    Microsoft Academic Search

    Trivellore E. Raghunathan; James E. Grizzle

    1995-01-01

    This article develops a survey design where the questionnaire is split into components and individuals are administered varying subsets of the components. A multiple imputation method for analyzing data from this design is developed, in which the imputations are created by random draws from the posterior predictive distribution of the missing parts, given the observed parts, using Gibbs sampling.
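
    A minimal sketch of the split-questionnaire idea above, written for illustration rather than taken from Raghunathan and Grizzle: it only shows how question blocks might be assigned so that every pair of blocks is observed jointly for some respondents, which is what later makes multiple imputation of the unasked parts feasible. The block counts and the "core block given to everyone" convention are assumptions.

```python
# Minimal sketch (not the authors' code): assigning questionnaire components
# to respondents in a split-questionnaire design. Component 0 is a "core"
# block given to everyone; each respondent also receives one pair of the
# remaining blocks, so every pair of blocks is observed jointly for some
# respondents (needed later when imputing the unasked parts).
import itertools
import random

def assign_components(n_respondents, n_components, per_person=2, seed=0):
    rng = random.Random(seed)
    # all pairs of non-core components, cycled so coverage stays balanced
    pairs = list(itertools.combinations(range(1, n_components), per_person))
    assignments = []
    for i in range(n_respondents):
        blocks = (0,) + pairs[i % len(pairs)]   # core block + one pair
        assignments.append(set(blocks))
    rng.shuffle(assignments)
    return assignments

if __name__ == "__main__":
    design = assign_components(n_respondents=12, n_components=4)
    for i, blocks in enumerate(design):
        print(f"respondent {i}: administered components {sorted(blocks)}")
```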

  7. FPGA Design Automation: A Survey

    Microsoft Academic Search

    Deming Chen; Jason Cong; Peichen Pan

    2006-01-01

    Design automation or computer-aided design (CAD) for field programmable gate arrays (FPGAs) has played a critical role in the rapid advancement and adoption of FPGA technology over the past two decades. The purpose of this paper is to meet the demand for an up-to-date comprehensive survey/tutorial for FPGA design automation, with an emphasis on the recent developments within...

  8. Transportation Tomorrow Survey: Design and Conduct of the Survey

    E-print Network

    Toronto, University of

    Transportation Tomorrow Survey 1996: Design and Conduct of the Survey, first report of the 1996 series. A telephone interview survey on household travel behaviour. ... that planned and directed the 1996 survey. The people who served on the technical committee were: Tom Appa...

  9. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  10. Watershed-based survey designs.

    PubMed

    Detenbeck, Naomi E; Cincotta, Dan; Denver, Judith M; Greenlee, Susan K; Olsen, Anthony R; Pitchford, Ann M

    2005-04-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream-downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs. PMID:15861987
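
    One design element named in this abstract, unequal probability weighting, can be sketched as follows. This is a generic illustration under assumed inputs (systematic probability-proportional-to-size selection with made-up watershed names and stream lengths), not the authors' procedure.

```python
# Hedged sketch of unequal-probability site selection for a watershed-based
# design: watershed polygons are drawn with inclusion probability proportional
# to size (here, stream kilometres) using systematic PPS sampling.
# The watershed names and "stream_km" sizes are invented for illustration.
import random

def systematic_pps(units, sizes, n, seed=1):
    """Systematic probability-proportional-to-size sample of n units."""
    rng = random.Random(seed)
    total = sum(sizes)
    step = total / n
    start = rng.uniform(0, step)
    targets = [start + k * step for k in range(n)]
    picks, cum, t = [], 0.0, 0
    for unit, size in zip(units, sizes):
        cum += size
        while t < n and targets[t] < cum:
            picks.append(unit)
            t += 1
    return picks

watersheds = [f"HUC{i:02d}" for i in range(1, 13)]
stream_km = [5, 40, 12, 80, 7, 25, 60, 15, 9, 33, 18, 50]
print(systematic_pps(watersheds, stream_km, n=4))
```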

  11. Construction of response surface designs for qualitative and quantitative factors

    Microsoft Academic Search

    C. F. J. Wu; Yuan Ding

    1998-01-01

    A general approach is proposed for constructing response surface designs of economical size for qualitative and quantitative factors. It starts with an efficient design (e.g. central composite design) for the quantitative factors and then partitions the design points into groups corresponding to different level combinations of the qualitative factors. Good designs are selected to ensure high estimation efficiency for models
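
    A small sketch (not the authors' construction) of the starting point they describe: a central composite design for two quantitative factors, whose points could then be partitioned across level combinations of a qualitative factor. Coded units and a face-centred axial distance are assumed, and the alternating partition below is purely illustrative.

```python
# Generate a central composite design (factorial + axial + centre points) in
# coded units, then naively partition the points between two levels of a
# qualitative factor. This is a generic illustration, not the paper's method.
import itertools
import numpy as np

def central_composite(n_factors, alpha=1.0):
    factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=n_factors)))
    axial = np.vstack([v * alpha * np.eye(n_factors)[i]
                       for i in range(n_factors) for v in (-1.0, 1.0)])
    center = np.zeros((1, n_factors))
    return np.vstack([factorial, axial, center])

ccd = central_composite(2)
# naive partition: alternate design points between two qualitative-factor levels
groups = {"level_A": ccd[0::2], "level_B": ccd[1::2]}
for level, pts in groups.items():
    print(level, pts.tolist())
```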

  12. Despite their utility, trawl surveys cannot obtain quantitative samples from...

    E-print Network

    Thomas Jagielo, Annette Hoffmann, Jack Tagart, Washington Department of Fish ... Despite their utility, trawl surveys cannot obtain quantitative samples from rough, rocky ... The trawl-survey habitat-bias may be substantial on the west...

  13. MARSAME Chapter 5: Implement the Survey Design

    E-print Network

    ... of potential hazards. Personnel must be trained with regard to potential physical and chemical safety hazards during the implementation of MARSAME disposition surveys. The focus of minimizing hazards is shifted away from environmental hazards (e.g., confined spaces, unstable surfaces, heat and cold stress) and towards scenarios where...

  14. AUV survey design applied to oceanic deep convection

    Microsoft Academic Search

    J. Scott Willcox; Yanwu Zhang; J. G. Bellingham; J. Marshall

    1996-01-01

    Oceanic processes are characterized by both temporal evolution and spatial variability. For surveys carried out using autonomous underwater vehicles (AUVs), compromises between resolution, total survey time, total survey area, and vehicle speed must be made. In this paper, quantitative tools for optimizing surveys are demonstrated in the context of mapping open-ocean deep convection. The survey performance is measured by a

  15. Approximations for Quantitative Feedback Theory Designs

    NASA Technical Reports Server (NTRS)

    Henderson, D. K.; Hess, R. A.

    1997-01-01

    The computational requirements for obtaining the results summarized in the preceding section were very modest and were easily accomplished using computer-aided control system design software. Of special significance is the ability of the PDT to indicate a loop closure sequence for MIMO QFT designs that employ sequential loop closure. Although discussed as part of a 2 x 2 design, the PDT is obviously applicable to designs with a greater number of inputs and system responses.

  16. Strategies for joint geophysical survey design

    NASA Astrophysics Data System (ADS)

    Shakas, Alexis; Maurer, Hansruedi

    2015-04-01

    In recent years, the use of multiple geophysical techniques to image the subsurface has become a popular option. Joint inversions of geophysical datasets are based on the assumption that the spatial variations of the different physical subsurface parameters exhibit structural similarities. In this work, we combine the benefits of joint inversions of geophysical datasets with recent innovations in optimized experimental design. These techniques maximize the data information content while minimizing the data acquisition costs. Experimental design has been used in geophysics over the last twenty years, but it has never been attempted to combine various geophysical imaging methods. We combine direct current geoelectrics, magnetotellurics and seismic refraction travel time tomography data to resolve synthetic 1D layered Earth models. An initial model for the subsurface structure can be taken from a priori geological information and an optimal joint geophysical survey can be designed around the initial model. Another typical scenario includes an existing data set from a past survey and a subsequent survey that is planned to optimally complement the existing data. Our results demonstrate that the joint design methodology provides optimized combinations of data sets that include only a few data points. Nevertheless, they allow constraining the subsurface models equally well as data from a densely sampled survey. Furthermore, we examine the dependency of optimized survey design on the a priori model assumptions. Finally, we apply the methodology to geoelectric and seismic field data collected along 2D profiles.
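
    The optimized experimental design step described above can be sketched generically. The following is a minimal illustration assuming a linearized forward operator and a greedy D-optimality criterion; it is not the authors' algorithm, and the random sensitivity matrix merely stands in for the joint geoelectric/magnetotelluric/seismic kernels.

```python
# Sequential optimized experimental design sketch: from a pool of candidate
# measurements with linearized sensitivities G (rows = candidate data,
# columns = model parameters), greedily add the measurement that most
# increases log det(G_s^T G_s), a D-optimality proxy for information content.
import numpy as np

def greedy_design(G, n_select, damping=1e-6):
    n_candidates, n_params = G.shape
    selected = []
    A = damping * np.eye(n_params)          # damped information matrix
    for _ in range(n_select):
        best_gain, best_i = -np.inf, None
        for i in range(n_candidates):
            if i in selected:
                continue
            gain = np.linalg.slogdet(A + np.outer(G[i], G[i]))[1]
            if gain > best_gain:
                best_gain, best_i = gain, i
        selected.append(best_i)
        A += np.outer(G[best_i], G[best_i])
    return selected

rng = np.random.default_rng(0)
G = rng.normal(size=(50, 5))                # 50 candidate data, 5 model parameters
print("chosen measurement indices:", greedy_design(G, n_select=8))
```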

  17. WATERSHED-BASED SURVEY DESIGNS

    EPA Science Inventory

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Sectio...

  18. 76 FR 27384 - Agency Information Collection Activity (Veteran Suicide Prevention Online Quantitative Surveys...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-11

    ...Quantitative Surveys) Under OMB Review AGENCY: Veterans Health Administration, Department of Veterans Affairs. ACTION: Notice...3501-21), this notice announces that the Veterans Health Administration (VHA), Department of...

  19. A Survey of Quantitative Team Performance Metrics for Human-Robot Collaboration

    E-print Network

    Akin, David

    Sharon M. Singer. A survey of the field of collaborative human and robot team performance metric models; examines existing overall team quantitative performance models to determine which are more applicable to future human and robotic...

  20. Quantitative design and evaluation of enhancement/thresholding edge detectors

    Microsoft Academic Search

    I. E. Abdou; W. K. Pratt

    1979-01-01

    Quantitative design and performance evaluation techniques are developed for the enhancement/thresholding class of image edge detectors. The design techniques are based on statistical detection theory and deterministic pattern-recognition classification procedures. The performance evaluation methods developed include: (a) deterministic measurement of the edge gradient amplitude; (b) comparison of the probabilities of correct and false edge detection; and (c) figure of merit computation. The...
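
    The "figure of merit computation" listed under (c) is commonly implemented as Pratt's figure of merit. A hedged sketch follows, with toy edge maps and the conventional scaling constant assumed; it is not code from the paper.

```python
# Pratt's figure of merit for an edge detector: detected edge pixels are scored
# by their distance to the nearest ideal edge pixel, normalized by the larger
# of the ideal and detected pixel counts. Edge maps are boolean arrays;
# alpha = 1/9 is the conventional scaling constant.
import numpy as np

def pratt_fom(ideal_edges, detected_edges, alpha=1.0 / 9.0):
    ideal_pts = np.argwhere(ideal_edges)
    det_pts = np.argwhere(detected_edges)
    n_ideal, n_det = len(ideal_pts), len(det_pts)
    if n_det == 0 or n_ideal == 0:
        return 0.0
    total = 0.0
    for p in det_pts:
        d2 = np.min(np.sum((ideal_pts - p) ** 2, axis=1))  # squared distance to nearest ideal edge
        total += 1.0 / (1.0 + alpha * d2)
    return total / max(n_ideal, n_det)

ideal = np.zeros((8, 8), dtype=bool); ideal[:, 4] = True          # true vertical edge
detected = np.zeros((8, 8), dtype=bool); detected[:, 5] = True    # detected one pixel off
print(f"FOM = {pratt_fom(ideal, detected):.3f}")
```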

  1. Career Plans Survey Report PennDesign Class of 2010

    E-print Network

    Plotkin, Joshua B.

    The following is based on Career Services' annual survey of graduating School of Design students. Of the students who received the survey either online or in the mail, 101 responded, a 36% response rate. Where we could, we...

  2. 1988 Schools and Staffing Survey Sample Design and Estimation. Schools and Staffing Survey. Technical Report.

    ERIC Educational Resources Information Center

    Kaufman, Steven

    The Schools and Staffing Survey (SASS) represents the first time the National Center for Education Statistics has integrated three of the Elementary and Secondary Education Surveys: the Teacher Demand and Shortage Surveys, Public and Private School Surveys, and Teacher Surveys. The SASS was designed to measure the critical aspects of teacher…

  3. Optimal design of focused experiments and surveys

    NASA Astrophysics Data System (ADS)

    Curtis, Andrew

    1999-10-01

    Experiments and surveys are often performed to obtain data that constrain some previously underconstrained model. Often, constraints are most desired in a particular subspace of model space. Experiment design optimization requires that the quality of any particular design can be both quantified and then maximized. This study shows how the quality can be defined such that it depends on the amount of information that is focused in the particular subspace of interest. In addition, algorithms are presented which allow one particular focused quality measure (from the class of focused measures) to be evaluated efficiently. A subclass of focused quality measures is also related to the standard variance and resolution measures from linearized inverse theory. The theory presented here requires that the relationship between model parameters and data can be linearized around a reference model without significant loss of information. Physical and financial constraints define the space of possible experiment designs. Cross-well tomographic examples are presented, plus a strategy for survey design to maximize information about linear combinations of parameters such as bulk modulus, κ = λ + 2μ/3.
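
    A small worked example of the "focused" quality idea, under assumed notation rather than the author's own: for a linearized experiment, a design can be scored by the inverse posterior variance of the target combination, here the bulk modulus κ = λ + 2μ/3. The two candidate designs below are invented for illustration.

```python
# Focused design quality sketch: for d = G m + noise with m = (lambda, mu),
# score a design by the inverse posterior variance of w^T m, where
# w = (1, 2/3) encodes kappa = lambda + (2/3) mu.
import numpy as np

def focused_quality(G, w, noise_var=1.0, prior_var=1e6):
    """Inverse posterior variance of w^T m for the linear experiment G."""
    n_params = G.shape[1]
    # Gaussian posterior covariance: (G^T C_d^-1 G + C_m^-1)^-1
    post_cov = np.linalg.inv(G.T @ G / noise_var + np.eye(n_params) / prior_var)
    return 1.0 / (w @ post_cov @ w)

w = np.array([1.0, 2.0 / 3.0])                          # kappa = lambda + (2/3) mu
design_a = np.array([[1.0, 0.0], [0.0, 1.0]])           # measures lambda and mu separately
design_b = np.array([[1.0, 2.0 / 3.0], [1.0, 2.0 / 3.0]])  # measures kappa directly, twice
for name, G in [("A", design_a), ("B", design_b)]:
    print(f"design {name}: focused quality = {focused_quality(G, w):.3f}")
```

    With equal measurement effort, the design aimed directly at the combination of interest scores higher on the focused measure, which is the point of the approach.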

  4. The λ-Design Conjecture: A Survey

    E-print Network

    Hein, Derek

    ... that all λ-designs can be constructed in this fashion. This paper is a survey of the current (2006) status ... Definition 1.4. Any λ-design E obtained in this fashion from a symmetric design D is called a type-1 λ-design. Derek W. Hein, Southern Utah...

  5. The Dark Energy Survey CCD imager design

    SciTech Connect

    Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Guarino, V.; Kuk, K.; Kuhlmann, S.; Schultz, K.; Schmitt, R.L.; Stefanik, A.; /Fermilab /Ohio State U. /Argonne

    2008-06-01

    The Dark Energy Survey is planning to use a 3 sq. deg. camera that houses a ~0.5 m diameter focal plane of 62 2k x 4k CCDs. The camera vessel including the optical window cell, focal plate, focal plate mounts, cooling system and thermal controls is described. As part of the development of the mechanical and cooling design, a full scale prototype camera vessel has been constructed and is now being used for multi-CCD readout tests. Results from this prototype camera are described.

  6. Design of future surveys: chapter 13

    USGS Publications Warehouse

    Bart, Jonathan; Smith, Paul A.

    2012-01-01

    This brief chapter addresses two related issues: how effort should be allocated to different parts of the sampling plan and, given optimal allocation, how large a sample will be required to achieve the PRISM accuracy target. Simulations based on data collected to date showed that 2 plots per cluster on rapid surveys, 2 intensive camps per field crew-year, 2-4 intensive plots per intensive camp, and 2-3 rapid surveys per intensive plot is the most efficient allocation of resources. Using this design, we investigated how crew-years should be allocated to each region in order to meet the PRISM accuracy target most efficiently. The analysis indicated that 40-50 crew-years would achieve the accuracy target for 18-24 of the 26 species breeding widely in the Arctic. This analysis was based on assuming that two rounds of surveys were conducted and that a 50% decline occurred between them. We discuss the complexity of making these estimates and why they should be viewed as first approximations.

  7. LSST Survey Strategy: Cadence Design and Simulation

    NASA Astrophysics Data System (ADS)

    Pinto, Philip A.; Cook, K. H.; Delgado, F.; Miller, M.; Denneau, L.; Saha, A.; Gee, P. A.; Tyson, J. A.; Ivezic, Z.; LSST Collaboration

    2006-12-01

    The LSST will allow a wide variety of science to be done using data from a single survey. A large part of ensuring this claim is designing a smart and adaptive algorithm for scheduling observations, one which can effectively merge multiple requirements into a single program of observations while maximizing time on the sky and coping effectively with changing conditions in real time. Diverse requirements include multiband imaging of 30,000 square degrees of sky, achieving a uniform depth of exposure across 20,000 square degrees in each of six filters, allowing effective search strategies for NEOs and short- and long-period variables, and providing frequent, deep exposures to characterize faint transients and moving objects. The LSST operations simulator includes a detailed model of seeing and sky transparency derived from data obtained at its site on Cerro Pachon, Chile. It also includes a detailed model of the delays incurred by readout of the camera, filter changes, active optics acquisition, and movements of the dome and telescope. We describe current progress in the LSST scheduler design and present simulations of a prototype ten-year LSST mission which demonstrate that all of the science requirements and constraints can be accommodated successfully into a single survey.

  8. Quantitative survey on the shape of the back of men's head as viewed from the side.

    PubMed

    Tamir, Abraham

    2013-05-01

    This article quantitatively classifies the back part of men's heads, viewed from the side, into 4 shapes that are demonstrated in some of the figures in this article. For self-evident reasons, the shapes were blurred. The survey is based on the analysis of 2220 shapes obtained by photographing mainly bald men and by finding pictures on the Internet. To the best of the author's knowledge, this quantitative approach has never been implemented before. The results obtained are as follows: the percentage of 376 "flat heads" is 17%; the percentage of 755 "little round heads," 34%; the percentage of 1017 "round heads," 45.8%; and the percentage of 72 "very round heads," 3.2%. This quantitative survey is an additional step in analyzing quantitatively the shapes of parts of the face; in articles previously published or to be published in this journal, the shapes of the nose, ear conch, and human eye were analyzed quantitatively. In addition, the shapes of the leg toes were also analyzed. Finally, it should be noted that, for obvious reasons, the survey is based on men's heads, most of which show baldness. PMID:23714907

  9. Theory of Model-Based Geophysical Survey and Experimental Design

    E-print Network

    Part A: Linear Problems. ... resistivity survey designs in real time as new data, and hence new information, are acquired ... logging, electromagnetic, earthquake monitoring and micro-seismic surveys, and in laboratory...

  10. Online Survey Design and Development: A Janus-Faced Approach

    ERIC Educational Resources Information Center

    Lauer, Claire; McLeod, Michael; Blythe, Stuart

    2013-01-01

    In this article we propose a "Janus-faced" approach to survey design--an approach that encourages researchers to consider how they can design and implement surveys more effectively using the latest web and database tools. Specifically, this approach encourages researchers to look two ways at once; attending to both the survey interface…

  11. Responsive design for household surveys: tools for actively controlling survey errors and costs

    Microsoft Academic Search

    Robert M. Groves; Steven G. Heeringa

    2006-01-01

    Over the past few years surveys have expanded to new populations, have incorporated measurement of new and more complex substantive issues and have adopted new data collection tools. At the same time there has been a growing reluctance among many household populations to participate in surveys. These factors have combined to present survey designers and survey researchers with increased uncertainty

  12. Workshop on Research and Survey Design in Environmental Politics

    E-print Network

    Wehrli, Bernhard

    1 March 2012, University of Bern. The Workshop on Research and Survey Design in Environmental Politics is designed to address practical questions using the freeware UCINET. Mark Lubell, Department of Environmental Science and Policy, University...

  13. Survey of rural, private wells. Statistical design

    USGS Publications Warehouse

    Mehnert, Edward; Schock, Susan C.

    1991-01-01

    Half of Illinois' 38 million acres were planted in corn and soybeans in 1988. On the 19 million acres planted in corn and soybeans, approximately 1 million tons of nitrogen fertilizer and 50 million pounds of pesticides were applied. Because groundwater is the water supply for over 90 percent of rural Illinois, the occurrence of agricultural chemicals in groundwater in Illinois is of interest to the agricultural community, the public, and regulatory agencies. The occurrence of agricultural chemicals in groundwater is well documented. However, the extent of this contamination still needs to be defined. This can be done by randomly sampling wells across a geographic area. Key elements of a random, water-well sampling program for regional groundwater quality include the overall statistical design of the program, definition of the sample population, selection of wells to be sampled, and analysis of survey results. These elements must be consistent with the purpose for conducting the program; otherwise, the program will not provide the desired information. The need to carefully design and conduct a sampling program becomes readily apparent when one considers the high cost of collecting and analyzing a sample. For a random sampling program conducted in Illinois, the key elements, as well as the limitations imposed by available information, are described.

  14. Design and Validation of the Quantum Mechanics Conceptual Survey

    ERIC Educational Resources Information Center

    McKagan, S. B.; Perkins, K. K.; Wieman, C. E.

    2010-01-01

    The Quantum Mechanics Conceptual Survey (QMCS) is a 12-question survey of students' conceptual understanding of quantum mechanics. It is intended to be used to measure the relative effectiveness of different instructional methods in modern physics courses. In this paper, we describe the design and validation of the survey, a process that included…

  15. Quantitative performance-based evaluation of a procedure for flexible design concept generation

    E-print Network

    Cardin, Michel-Alexandre, 1979-

    2011-01-01

    This thesis presents an experimental methodology for objective and quantitative design procedure evaluation based on anticipated lifecycle performance of design concepts, and a procedure for flexible design concept generation. ...

  16. Designing community surveys to provide a basis for noise policy

    NASA Technical Reports Server (NTRS)

    Fields, J. M.

    1980-01-01

    After examining reports from a large number of social surveys, two areas were identified where methodological improvements in the surveys would be especially useful for public policy. The two study areas are: the definition of noise indexes and the assessment of noise impact. Improvements in the designs of surveys are recommended which would increase the validity and reliability of the noise indexes. Changes in interview questions and sample designs are proposed which would enable surveys to provide measures of noise impact which are directly relevant for public policy.

  17. A survey of pipe routing design

    Microsoft Academic Search

    Xiao-long Qian; Tao Ren; Cheng-en Wang

    2008-01-01

    In this paper, we first review some recent contributions to pipe routing design, categorized into four fields: factory layout, circuit layout, aircraft design, and ship piping system design. The main constraints in this field are then discussed. Lastly, a brief overview of the major design approaches is provided.

  18. Survey of quantitative data on the solar energy and its spectra distribution

    NASA Technical Reports Server (NTRS)

    Thekaekara, M. P.

    1976-01-01

    This paper presents a survey of available quantitative data on the total and spectral solar irradiance at ground level and outside the atmosphere. Measurements from research aircraft have resulted in the currently accepted NASA/ASTM standards of the solar constant and zero air mass solar spectral irradiance. The intrinsic variability of solar energy output and programs currently under way for more precise measurements from spacecraft are discussed. Instrumentation for solar measurements and their reference radiation scales are examined. Insolation data available from the records of weather stations are reviewed for their applicability to solar energy conversion. Two alternate methods of solarimetry are briefly discussed.

  19. PiMA Survey Design and Methodology

    E-print Network

    Mudhai, Okoth Fred; Abreu Lopes, Claudia; Mitullah, Winnie; Fraser, Alastair; Milapo, Nalukui; Mwangi, Sammy; (PI) Srinivasan, Sharath

    2015-06-23

    ... in the capital city Nairobi, with mixed demographics including one of the city's major slums; and Seme, a rural constituency settled around Lake Victoria in a largely fisher-agricultural community in the western Kenyan city of Kisumu. In Zambia, the surveys were...

  20. Employee Interest Survey

    E-print Network

    This employee interest survey is designed to assess employee interests in worksite wellness and to help plan a wellness program that meets the needs of employees at your worksite. All information ... Sample interest items: ... to control my weight; learning more about the benefits of physical activity; engaging in a walking...

  1. Split Questionnaire Design for Massive Surveys

    Microsoft Academic Search

    Feray Adigüzel; Michel Wedel

    2008-01-01

    Generating Between-Block Designs: We start by describing the procedure used to generate the between-block designs. We assume that if there are N individuals and Q questions, then N/K individuals will be assigned randomly to each of the K splits. Each alternative split questionnaire design then consists of an N x Q matrix D with K different split patterns. Each...
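
    The N x Q design matrix D described above can be built directly. This is a minimal sketch with invented block sizes and split patterns, not the authors' procedure; it only makes the between-block structure concrete.

```python
# Build an N x Q indicator matrix D: entry (i, q) is 1 if respondent i is
# asked question q. Questions are grouped into blocks, and N/K respondents
# are assigned to each of K split patterns (each pattern = a set of blocks).
import numpy as np

def split_design_matrix(n_respondents, blocks, split_patterns):
    """blocks: list of question-index lists; split_patterns: list of block-index tuples."""
    n_questions = sum(len(b) for b in blocks)
    K = len(split_patterns)
    D = np.zeros((n_respondents, n_questions), dtype=int)
    for i in range(n_respondents):
        pattern = split_patterns[i % K]          # N/K respondents per split
        for b in pattern:
            D[i, blocks[b]] = 1
    return D

blocks = [[0, 1], [2, 3], [4, 5], [6, 7]]                       # 8 questions in 4 blocks
patterns = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]     # K = 6 split patterns
D = split_design_matrix(n_respondents=12, blocks=blocks, split_patterns=patterns)
print(D)
```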

  2. Lack of quantitative training among early-career ecologists: a survey of the problem and potential solutions

    PubMed Central

    Ezard, Thomas H.G.; Jørgensen, Peter S.; Zimmerman, Naupaka; Chamberlain, Scott; Salguero-Gómez, Roberto; Curran, Timothy J.; Poisot, Timothée

    2014-01-01

    Proficiency in mathematics and statistics is essential to modern ecological science, yet few studies have assessed the level of quantitative training received by ecologists. To do so, we conducted an online survey. The 937 respondents were mostly early-career scientists who studied biology as undergraduates. We found a clear self-perceived lack of quantitative training: 75% were not satisfied with their understanding of mathematical models; 75% felt that the level of mathematics was “too low” in their ecology classes; 90% wanted more mathematics classes for ecologists; and 95% more statistics classes. Respondents thought that 30% of classes in ecology-related degrees should be focused on quantitative disciplines, which is likely higher than for most existing programs. The main suggestion to improve quantitative training was to relate theoretical and statistical modeling to applied ecological problems. Improving quantitative training will require dedicated, quantitative classes for ecology-related degrees that contain good mathematical and statistical practice. PMID:24688862

  3. Lack of quantitative training among early-career ecologists: a survey of the problem and potential solutions.

    PubMed

    Barraquand, Frédéric; Ezard, Thomas H G; Jørgensen, Peter S; Zimmerman, Naupaka; Chamberlain, Scott; Salguero-Gómez, Roberto; Curran, Timothy J; Poisot, Timothée

    2014-01-01

    Proficiency in mathematics and statistics is essential to modern ecological science, yet few studies have assessed the level of quantitative training received by ecologists. To do so, we conducted an online survey. The 937 respondents were mostly early-career scientists who studied biology as undergraduates. We found a clear self-perceived lack of quantitative training: 75% were not satisfied with their understanding of mathematical models; 75% felt that the level of mathematics was "too low" in their ecology classes; 90% wanted more mathematics classes for ecologists; and 95% more statistics classes. Respondents thought that 30% of classes in ecology-related degrees should be focused on quantitative disciplines, which is likely higher than for most existing programs. The main suggestion to improve quantitative training was to relate theoretical and statistical modeling to applied ecological problems. Improving quantitative training will require dedicated, quantitative classes for ecology-related degrees that contain good mathematical and statistical practice. PMID:24688862

  4. A Survey of Agent Designs for TAC SCM

    E-print Network

    Ketter, Wolfgang

    Wolfgang Ketter, Dept. of DIS, RSM Erasmus University Rotterdam. ... approaches among the competitors in the Trading Agent Competition for Supply Chain Management (TAC SCM). We identify the problem of decision coordination as a crucial element in the design of an agent for TAC SCM...

  5. A Survey and Taxonomy of GALS Design Styles

    E-print Network

    Lemieux, Guy

    Paul Teehan, Mark Greenstreet, and Guy Lemieux. In this article, we describe some design examples and introduce our taxonomy of these techniques.

  6. The Dark Energy Survey instrument design

    NASA Astrophysics Data System (ADS)

    Flaugher, B.

    2006-06-01

    We describe a new project, the Dark Energy Survey (DES), aimed at measuring the dark energy equation of state parameter, w, to a statistical precision of ~5%, with four complementary techniques. The survey will use a new 3 sq. deg. mosaic camera (DECam) mounted at the prime focus of the Blanco 4m telescope at the Cerro-Tololo International Observatory (CTIO). DECam includes a large mosaic camera, a five element optical corrector, four filters (g,r,i,z), and the associated infrastructure for operation in the prime focus cage. The focal plane consists of 62 2K x 4K CCD modules (0.27"/pixel) arranged in a hexagon inscribed within the 2.2 deg. diameter field of view. We plan to use the 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). At Fermilab, we will establish a packaging factory to produce four-side buttable modules for the LBNL devices, as well as to test and grade the CCDs. R&D is underway and delivery of DECam to CTIO is scheduled for 2009.

  7. The Dark Energy Survey instrument design

    SciTech Connect

    Flaugher, B.; /Fermilab

    2006-05-01

    We describe a new project, the Dark Energy Survey (DES), aimed at measuring the dark energy equation of state parameter, w, to a statistical precision of ~5%, with four complementary techniques. The survey will use a new 3 sq. deg. mosaic camera (DECam) mounted at the prime focus of the Blanco 4m telescope at the Cerro-Tololo International Observatory (CTIO). DECam includes a large mosaic camera, a five element optical corrector, four filters (g,r,i,z), and the associated infrastructure for operation in the prime focus cage. The focal plane consists of 62 2K x 4K CCD modules (0.27''/pixel) arranged in a hexagon inscribed within the 2.2 deg. diameter field of view. We plan to use the 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). At Fermilab, we will establish a packaging factory to produce four-side buttable modules for the LBNL devices, as well as to test and grade the CCDs. R&D is underway and delivery of DECam to CTIO is scheduled for 2009.

  8. 23 CFR 1340.11 - Post-approval alterations to survey design.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...false Post-approval alterations to survey design. 1340.11 Section 1340...UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements...11 Post-approval alterations to survey design. After NHTSA approval of...

  9. 23 CFR 1340.11 - Post-approval alterations to survey design.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...false Post-approval alterations to survey design. 1340.11 Section 1340...UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements...11 Post-approval alterations to survey design. After NHTSA approval of...

  10. 23 CFR 1340.11 - Post-approval alterations to survey design.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...false Post-approval alterations to survey design. 1340.11 Section 1340...UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements...11 Post-approval alterations to survey design. After NHTSA approval of...

  11. Acoustic Surveys of Methane Plumes by Quantitative Echo Sounder in the Eastern Margin of Japan Sea

    NASA Astrophysics Data System (ADS)

    Aoyama, C.; Matsumoto, R.

    2009-12-01

    During methane hydrate exploration and research, remote and on board acoustic surveying and monitoring of methane hydrate can be easily and economically conducted using a quantitative echo sounder. Simultaneously, the structure and the floating up speed of methane plumes can be obtained from an analysis of acoustic data. We conducted a survey of methane plumes from 2004 through 2008 at a spur situated southwest off the coast of Sado Island, tentatively called Umitaka Spur, and at the Joetsu Knoll. In 2007 and 2008, we performed experiments by releasing methane hydrate bubbles and methane hydrate, and letting them float upward. Consequently, we demonstrated that acoustical reflection from the methane plumes correlates with water temperature and depth, that the floating up speed is constant but depends on the conditions of methane hydrate, that the discharge of methane hydrate bubbles changes, and that there is a wide scattering of materials below the seafloor where methane plumes are located. The method will be applied not only to basic research on methane hydrate but also to assessments of the environmental impact of methane hydrate exploitation.

  12. A SUCCESSFUL BROADBAND SURVEY FOR GIANT Ly{alpha} NEBULAE. I. SURVEY DESIGN AND CANDIDATE SELECTION

    SciTech Connect

    Prescott, Moire K. M. [Department of Physics, Broida Hall, Mail Code 9530, University of California, Santa Barbara, CA 93106 (United States); Dey, Arjun; Jannuzi, Buell T., E-mail: mkpresco@physics.ucsb.edu [National Optical Astronomy Observatory, 950 North Cherry Avenue, Tucson, AZ 85719 (United States)

    2012-04-01

    Giant Lyα nebulae (or Lyα 'blobs') are likely sites of ongoing massive galaxy formation, but the rarity of these powerful sources has made it difficult to form a coherent picture of their properties, ionization mechanisms, and space density. Systematic narrowband Lyα nebula surveys are ongoing, but the small redshift range covered and the observational expense limit the comoving volume that can be probed by even the largest of these surveys and pose a significant problem when searching for such rare sources. We have developed a systematic search technique designed to find large Lyα nebulae at 2 ≲ z ≲ 3 within deep broadband imaging and have carried out a survey of the 9.4 deg² NOAO Deep Wide-Field Survey Boötes field. With a total survey comoving volume of ≈10⁸ h₇₀⁻³ Mpc³, this is the largest volume survey for Lyα nebulae ever undertaken. In this first paper in the series, we present the details of the survey design and a systematically selected sample of 79 candidates, which includes one previously discovered Lyα nebula.
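
    As a rough plausibility check of the quoted volume (a calculation with astropy's standard flat ΛCDM routines under assumed parameters, not the authors' exact cosmology), the comoving volume of a 9.4 deg² field between z = 2 and z = 3 comes out at order 10⁸ Mpc³:

```python
# Back-of-envelope comoving volume of the Bootes field between z = 2 and 3.
from astropy.cosmology import FlatLambdaCDM
import astropy.units as u
import numpy as np

cosmo = FlatLambdaCDM(H0=70, Om0=0.3)               # h_70 convention, as in the abstract
full_sky_deg2 = 4 * np.pi * (180 / np.pi) ** 2      # ~41,253 deg^2
frac = 9.4 / full_sky_deg2                          # sky fraction covered by the field
shell = cosmo.comoving_volume(3.0) - cosmo.comoving_volume(2.0)
volume = (shell * frac).to(u.Mpc ** 3).value
print(f"survey comoving volume ~ {volume:.2e} Mpc^3")   # of order 1e8 Mpc^3
```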

  13. The Design of a Quantitative Western Blot Experiment

    PubMed Central

    Taylor, Sean C.; Posch, Anton

    2014-01-01

    Western blotting is a technique that has been in practice for more than three decades that began as a means of detecting a protein target in a complex sample. Although there have been significant advances in both the imaging and reagent technologies to improve sensitivity, dynamic range of detection, and the applicability of multiplexed target detection, the basic technique has remained essentially unchanged. In the past, western blotting was used simply to detect a specific target protein in a complex mixture, but now journal editors and reviewers are requesting the quantitative interpretation of western blot data in terms of fold changes in protein expression between samples. The calculations are based on the differential densitometry of the associated chemiluminescent and/or fluorescent signals from the blots and this now requires a fundamental shift in the experimental methodology, acquisition, and interpretation of the data. We have recently published an updated approach to produce quantitative densitometric data from western blots (Taylor et al., 2013) and here we summarize the complete western blot workflow with a focus on sample preparation and data analysis for quantitative western blotting. PMID:24738055

  14. The design of a quantitative western blot experiment.

    PubMed

    Taylor, Sean C; Posch, Anton

    2014-01-01

    Western blotting is a technique that has been in practice for more than three decades that began as a means of detecting a protein target in a complex sample. Although there have been significant advances in both the imaging and reagent technologies to improve sensitivity, dynamic range of detection, and the applicability of multiplexed target detection, the basic technique has remained essentially unchanged. In the past, western blotting was used simply to detect a specific target protein in a complex mixture, but now journal editors and reviewers are requesting the quantitative interpretation of western blot data in terms of fold changes in protein expression between samples. The calculations are based on the differential densitometry of the associated chemiluminescent and/or fluorescent signals from the blots and this now requires a fundamental shift in the experimental methodology, acquisition, and interpretation of the data. We have recently published an updated approach to produce quantitative densitometric data from western blots (Taylor et al., 2013) and here we summarize the complete western blot workflow with a focus on sample preparation and data analysis for quantitative western blotting. PMID:24738055
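
    The fold-change arithmetic referred to in both records above can be sketched generically. The lane names and signal values below are invented; this is not the Taylor and Posch protocol, only the normalization-then-ratio calculation that underlies quantitative densitometry.

```python
# Generic densitometry sketch: target-band signal is normalized to a loading
# control in each lane, then each lane is expressed as a fold change relative
# to the mean normalized control lane. All numbers are made up.
target_signal  = {"ctrl_1": 1200, "ctrl_2": 1100, "treat_1": 2600, "treat_2": 2400}
loading_signal = {"ctrl_1": 5000, "ctrl_2": 4800, "treat_1": 5100, "treat_2": 4900}

normalized = {lane: target_signal[lane] / loading_signal[lane] for lane in target_signal}
ctrl_mean = sum(v for k, v in normalized.items() if k.startswith("ctrl")) / 2

for lane, value in normalized.items():
    print(f"{lane}: fold change vs control = {value / ctrl_mean:.2f}")
```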

  15. Towards Quantitative Time Domain Design Tradeoffs in Nonlinear Control

    E-print Network

    Braslavsky, Julio H.

    ... survey paper [11], this evolution, during the 1990s, was marked by the transformation ... for certain classes of nonminimum phase systems, the closed loop transient step response must display ... Keywords: performance analysis, cheap control, quadratic optimal regulators, non-minimum phase systems.

  16. Design of part family robust-to-production plan variations based on quantitative manufacturability evaluation

    E-print Network

    Saitou, Kazuhiro "Kazu"

    Byungwoo Lee, Kazuhiro Saitou. A method is proposed to design product families that are robust to production plan variations, based on quantitative manufacturability evaluation. This paper presents a systematic method for designing part...

  17. Multidisciplinary aerospace design optimization: Survey of recent developments

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Haftka, Raphael T.

    1995-01-01

    The increasing complexity of engineering systems has sparked increasing interest in multidisciplinary optimization (MDO). This paper presents a survey of recent publications in the field of aerospace where interest in MDO has been particularly intense. The two main challenges of MDO are computational expense and organizational complexity. Accordingly the survey is focussed on various ways different researchers use to deal with these challenges. The survey is organized by a breakdown of MDO into its conceptual components. Accordingly, the survey includes sections on Mathematical Modeling, Design-oriented Analysis, Approximation Concepts, Optimization Procedures, System Sensitivity, and Human Interface. With the authors' main expertise being in the structures area, the bulk of the references focus on the interaction of the structures discipline with other disciplines. In particular, two sections at the end focus on two such interactions that have recently been pursued with a particular vigor: Simultaneous Optimization of Structures and Aerodynamics, and Simultaneous Optimization of Structures Combined With Active Control.

  18. Evolvable Systems in Hardware Design: Taxonomy, Survey and Applications

    Microsoft Academic Search

    Ricardo Salem Zebulum; Marco Aurélio Cavalcanti Pacheco; Marley B. R. Vellasco

    1996-01-01

    This article proposes a taxonomy, presents a survey and describes a set of applications on Evolvable Hardware Systems (EHW). The taxonomy is based on the following properties: Hardware Evaluation Process, Evolutionary Approach, Target Application Area and Evolving Platform. Recent reported applications on EHW are also reviewed, according to the proposed taxonomy. Additionally, a set of digital design applications, developed by

  19. Particle design using supercritical fluids: Literature and patent survey

    Microsoft Academic Search

    Jennifer Jung; Michel Perrut

    2001-01-01

    As particle design is presently a major development of supercritical fluids applications, mainly in the pharmaceutical, nutraceutical, cosmetic and specialty chemistry industries, a number of publications are issued and numerous patents filed every year. This document presents a survey (that cannot pretend to be exhaustive!) of published knowledge classified according to the different concepts currently used to manufacture particles, microspheres or...

  20. Application of load survey systems to proper tariff design

    Microsoft Academic Search

    C. S. Chen; J. C. Hwang; C. W. Huang

    1997-01-01

    This paper proposes a proper rate-making strategy for a publicly owned utility by taking into account customer load characteristics. The load survey system has been well designed by sampling theory to find the customers for power consumption information collection. In this manner, the typical load patterns derived can effectively represent the load behavior of each customer class. The...

  1. Multiwavelength CO2 DIAL system designed for quantitative concentration measurement

    Microsoft Academic Search

    Joseph Leonelli; Jan van der Laan; Peter Holland; Leland Fletcher; Russell Warren

    1990-01-01

    A multiwavelength CO2 direct-detection differential absorption lidar (DIAL) system capable of providing range-resolved vapor-concentration contour plots of a 1000 sq m grid at 20-m spatial resolution in 10-s intervals is reported. Design goals are outlined along with system specifications. The self-contained mobile system is modular in design and can detect, identify, quantify, and map chemical vapor clouds having significant spectral

  2. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
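
    One of the surveyed metamodeling techniques, response surface methodology, reduces to fitting a low-order polynomial to sampled runs of an expensive analysis code. A minimal sketch follows; the "expensive_analysis" function and the random design of experiments are stand-ins assumed for illustration, not part of the paper.

```python
# Fit a quadratic response-surface metamodel to an expensive analysis code
# by least squares, then query the cheap surrogate instead of the code.
import numpy as np

def expensive_analysis(x1, x2):
    return (1 - x1) ** 2 + 100 * (x2 - x1 ** 2) ** 2   # placeholder for a real solver

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(30, 2))                    # design of experiments (random here)
y = np.array([expensive_analysis(x1, x2) for x1, x2 in X])

# full quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2
basis = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                         X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
coeffs, *_ = np.linalg.lstsq(basis, y, rcond=None)

x_new = np.array([0.5, 0.5])
surrogate = coeffs @ np.array([1, x_new[0], x_new[1],
                               x_new[0] ** 2, x_new[1] ** 2, x_new[0] * x_new[1]])
print(f"surrogate prediction at {x_new}: {surrogate:.2f}, "
      f"true value: {expensive_analysis(*x_new):.2f}")
```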

  3. Large Synoptic Survey Telescope: From Science Drivers to Reference Design

    SciTech Connect

    Ivezic, Z.; Axelrod, T.; Brandt, W.N.; Burke, D.L.; Claver, C.F.; Connolly, A.; Cook, K.H.; Gee, P.; Gilmore, D.K.; Jacoby, S.H.; Jones, R.L.; Kahn, S.M.; Kantor, J.P.; Krabbendam, V.; Lupton, R.H.; Monet, D.G.; Pinto, P.A.; Saha, A.; Schalk, T.L.; Schneider, D.P.; Strauss, Michael A.; /Washington U., Seattle, Astron. Dept. /LSST Corp. /Penn State U., Astron. Astrophys. /KIPAC, Menlo Park /NOAO, Tucson /LLNL, Livermore /UC, Davis /Princeton U., Astrophys. Sci. Dept. /Naval Observ., Flagstaff /Arizona U., Astron. Dept. - Steward Observ. /UC, Santa Cruz /Harvard U. /Johns Hopkins U. /Illinois U., Urbana

    2011-10-14

    In the history of astronomy, major advances in our understanding of the Universe have come from dramatic improvements in our ability to accurately measure astronomical quantities. Aided by rapid progress in information technology, current sky surveys are changing the way we view and study the Universe. Next-generation surveys will maintain this revolutionary progress. We focus here on the most ambitious survey currently planned in the visible band, the Large Synoptic Survey Telescope (LSST). LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. It will be a large, wide-field ground-based system designed to obtain multiple images covering the sky that is visible from Cerro Pachon in Northern Chile. The current baseline design, with an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3,200 Megapixel camera, will allow about 10,000 square degrees of sky to be covered using pairs of 15-second exposures in two photometric bands every three nights on average. The system is designed to yield high image quality, as well as superb astrometric and photometric accuracy. The survey area will include 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg² region about 1000 times in the six bands during the anticipated 10 years of operation. These data will result in databases including 10 billion galaxies and a similar number of stars, and will serve the majority of science programs. The remaining 10% of the observing time will be allocated to special programs such as Very Deep and Very Fast time domain surveys. We describe how the LSST science drivers led to these choices of system parameters.

  4. Assessing usual dietary intake in complex sample design surveys: the National Dietary Survey.

    PubMed

    Barbosa, Flávia dos Santos; Sichieri, Rosely; Junger, Washington Leite

    2013-02-01

    The National Cancer Institute (NCI) method allows the distributions of usual intake of nutrients and foods to be estimated. This method can be used in complex surveys. However, the user must perform additional calculations, such as balanced repeated replication (BRR), in order to obtain standard errors and confidence intervals for the percentiles and mean from the distribution of usual intake. The objective is to highlight adaptations of the NCI method using data from the National Dietary Survey. The application of the NCI method was exemplified analyzing the total energy (kcal) and fruit (g) intake, comparing estimations of mean and standard deviation that were based on the complex design of the Brazilian survey with those assuming simple random sample. Although means point estimates were similar, estimates of standard error using the complex design increased by up to 60% compared to simple random sample. Thus, for valid estimates of food and energy intake for the population, all of the sampling characteristics of the surveys should be taken into account because when these characteristics are neglected, statistical analysis may produce underestimated standard errors that would compromise the results and the conclusions of the survey. PMID:23703261
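
    The balanced repeated replication (BRR) step mentioned above can be sketched for a two-PSUs-per-stratum design. The strata, weights, and Hadamard-based replicate scheme below are toy values assumed for illustration, not the National Dietary Survey design.

```python
# Balanced repeated replication (BRR): form half-sample replicates by doubling
# the weight of one PSU per stratum and zeroing the other (pattern given by a
# Hadamard matrix row), re-estimate, and use the spread of replicate estimates
# around the full-sample estimate as the variance.
import numpy as np

def brr_standard_error(values, weights, strata, psu, hadamard):
    """BRR standard error of a weighted mean for a two-PSUs-per-stratum design."""
    full = np.average(values, weights=weights)
    replicates = []
    for row in hadamard:                       # one replicate per Hadamard row
        w = weights.copy()
        for s, sign in enumerate(row):
            keep = 0 if sign > 0 else 1        # which PSU is doubled in stratum s
            w[(strata == s) & (psu == keep)] *= 2.0
            w[(strata == s) & (psu != keep)] = 0.0
        replicates.append(np.average(values, weights=w))
    replicates = np.array(replicates)
    return np.sqrt(np.mean((replicates - full) ** 2))

# toy data: 4 strata x 2 PSUs x 3 respondents
strata = np.repeat(np.arange(4), 6)
psu = np.tile(np.repeat([0, 1], 3), 4)
rng = np.random.default_rng(2)
energy = rng.normal(2000, 300, size=24)        # daily energy intake, kcal
weights = rng.uniform(0.5, 2.0, size=24)
hadamard = np.array([[1, 1, 1, 1], [1, -1, 1, -1], [1, 1, -1, -1], [1, -1, -1, 1]])
print(f"weighted mean = {np.average(energy, weights=weights):.0f} kcal, "
      f"BRR SE = {brr_standard_error(energy, weights, strata, psu, hadamard):.1f}")
```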

  5. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  6. On the Design of IEEE Compliant Floating-Point Units and Their Quantitative Analysis

    E-print Network

    Seidel, Peter-Michael

    Dissertation. ... floating-point unit (FPU) that is fully compliant with the IEEE floating-point standard 754-1985 [19]. There are a few choices that need to be made when designing an IEEE compliant FPU, among them: the internal representation...

  7. Further steps towards a quantitative approach to durability design

    Microsoft Academic Search

    Z. Lounis; M. A. Lacasse; K. Moser

    This paper presents further steps in the development of reliability-based approaches for the durability design and service life prediction of building components which integrate the requirements of safety, serviceability and durability. In general, the load and resistance should be modelled as stochastic processes and the resulting durability problem is formulated in a time-dependent probabilistic format. Using the classical reliability approach,

  8. Quantitative structural design of high voltage potted electronic modules

    Microsoft Academic Search

    A. T. Tweedie; P. A. Lieberman

    1981-01-01

    Failure analysis of traveling wave tubes (TWT's) revealed high voltage arc-overs due to cracks in the potting material. It is suggested that the geometric features of the design caused stresses during thermal cycling, which were less than the static strength of the material, but high enough to cause slow crack growth from flaws in the material. The USAF helped sponsor

  9. Quantitative structural design of high voltage potted electronic modules

    NASA Astrophysics Data System (ADS)

    Tweedie, A. T.; Lieberman, P. A.

    Failure analysis of traveling wave tubes (TWT's) revealed high voltage arc-overs due to cracks in the potting material. It is suggested that the geometric features of the design caused stresses during thermal cycling, which were less than the static strength of the material, but high enough to cause slow crack growth from flaws in the material. The USAF helped sponsor a program to investigate this phenomenon in relation to the 40 W 293H TWT, and to develop a new design which would have high reliability with respect to potting compound structural failures. An iterated design-analysis process was used and coupled with life predictions based on an understanding of the fracture mechanics of the materials involved. The fundamental design data and analysis procedures consisted of: (1) materials characterization; (2) stress, dynamics and thermal analysis of the TWT and its redesign; (3) measurement of the rate of crack growth in the potting compound as a function of stress and temperature; and (4) life prediction of the redesigned TWT.

  10. A survey of optimization techniques for integrated-circuit design

    NASA Astrophysics Data System (ADS)

    Brayton, R. K.; Hachtel, G. D.; Sangiovanni-Vincentelli, A. L.

    1981-10-01

    Contemporary optimization techniques are surveyed and related to optimization problems which arise in the design of integrated circuits. Theory, algorithms, and programs are reviewed, and an assessment is made of the impact optimization has had and will have on integrated-circuit design. Integrated circuits are characterized by complex tradeoffs between multiple nonlinear objectives with multiple nonlinear and sometimes nonconvex constraints. Function and gradient evaluations require the solution of very large sets of nonlinear differential equations; consequently they are inaccurate and extremely expensive. Furthermore, the parameters to be optimized are subject to inherent statistical fluctuations. Particular emphasis is given to those multiobjective constrained optimization techniques which are appropriate to this environment.

  11. The HST/ACS Coma Cluster Survey: I - Survey Objectives and Design

    E-print Network

    David Carter; Paul Goudfrooij; Bahram Mobasher; Henry C. Ferguson; Thomas H. Puzia; Alfonso L. Aguerri; Marc Balcells; Dan Batcheldor; Terry J. Bridges; Jonathan I. Davies; Peter Erwin; Alister W. Graham; Rafael Guzmán; Derek Hammer; Ann Hornschemeier; Carlos Hoyos; Michael J. Hudson; Avon Huxor; Shardha Jogee; Yutaka Komiyama; Jennifer Lotz; John R. Lucey; Ronald O. Marzke; David Merritt; Bryan W. Miller; Neal A. Miller; Mustapha Mouhcine; Sadanori Okamura; Reynier F. Peletier; Steven Phillipps; Bianca M. Poggianti; Ray M. Sharples; Russell J. Smith; Neil Trentham; R. Brent Tully; Edwin Valentijn; Gijs Verdoes Kleijn

    2008-01-24

    We describe the HST ACS Coma cluster Treasury survey, a deep two-passband imaging survey of one of the nearest rich clusters of galaxies, the Coma cluster (Abell 1656). The survey was designed to cover an area of 740 square arcmin in regions of different density of both galaxies and intergalactic medium within the cluster. The ACS failure of January 27th 2007 leaves the survey 28% complete, with 21 ACS pointings (230 square arcmin) complete, and partial data for a further 4 pointings (44 square arcmin). Predicted survey depth for 10 sigma detections for optimal photometry of point sources is g' = 27.6 in the F475W filter, and IC=26.8 mag in F814 (AB magnitudes). Initial simulations with artificially injected point sources show 90% recovered at magnitude limits of g' = 27.55 and IC = 26.65. For extended sources, the predicted 10 sigma limits for a 1 square arcsecond region are g' = 25.8 mag/sq. arcsec and IC = 25.0 mag/sq. arcsec. We highlight several motivating science goals of the survey, including study of the faint end of the cluster galaxy luminosity function, structural parameters of dwarf galaxies, stellar populations and their effect on colors and color gradients, evolution of morphological components in a dense environment, the nature of ultra compact dwarf galaxies, and globular cluster populations of cluster galaxies of a range of luminosities and types. This survey will also provide a local rich cluster benchmark for various well known global scaling relations and explore new relations pertaining to the nuclear properties of galaxies.

  12. The WiggleZ Dark Energy Survey: Survey Design and First Data Release

    E-print Network

    Drinkwater, Michael J; Blake, Chris; Woods, David; Pimbblet, Kevin A; Glazebrook, Karl; Sharp, Rob; Pracy, Michael B; Brough, Sarah; Colless, Matthew; Couch, Warrick J; Croom, Scott M; Davis, Tamara M; Forbes, Duncan; Forster, Karl; Gilbank, David G; Gladders, Michael; Jelliffe, Ben; Jones, Nick; Li, I-hui; Madore, Barry; Martin, D Christopher; Poole, Gregory B; Small, Todd; Wisnioski, Emily; Wyder, Ted; Yee, H K C

    2009-01-01

    The WiggleZ Dark Energy Survey is a survey of 240,000 emission line galaxies in the distant universe, measured with the AAOmega spectrograph on the 3.9-m Anglo-Australian Telescope (AAT). The target galaxies are selected using ultraviolet photometry from the GALEX satellite, with a flux limit of NUV<22.8 mag. The redshift range containing 90% of the galaxies is 0.2 < z < 1.0. The primary aim of the survey is to precisely measure the scale of baryon acoustic oscillations (BAO) imprinted on the spatial distribution of these galaxies at look-back times of 4-8 Gyrs. Detailed forecasts indicate the survey will measure the BAO scale to better than 2% and the tangential and radial acoustic wave scales to approximately 3% and 5%, respectively. This paper provides a detailed description of the survey and its design, as well as the spectroscopic observations, data reduction, and redshift measurement techniques employed. It also presents an analysis of the properties of the target galaxies, including emission line ...

  13. Model Driven Design and Implementation of Statistical Surveys Chul Hwee Kim1

    E-print Network

    Grundy, John

    Model Driven Design and Implementation of Statistical Surveys Chul Hwee Kim1 , John Hosking1@cs}.auckland.ac.nz Abstract We describe the evolution of a statistical survey design visual language from a standalone design statistical surveys. This involved, firstly, elaboration of the notation to support additional requirements

  14. SEDS: The Spitzer Extended Deep Survey. Survey Design, Photometry, and Deep IRAC Source Counts

    NASA Technical Reports Server (NTRS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Arendt, R.; Barmby, P.; Barro, G.; Bell, E. F.; Bouwens, R.; Cattaneo, A.; Croton, D.; Dave, R.; Dunlop, J. S.; Egami, E.; Faber, S.; Finlator, K.; Grogin, N. A.; Guhathakurta, P.; Hernquist, L.; Hora, J. L.; Illingworth, G.; Kashlinsky, A.; Koekemoer, A. M.; Koo, D. C.; Moseley, H.

    2013-01-01

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg^2 to a depth of 26 AB mag (3 sigma) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 micron. Because of its uniform depth of coverage in so many widely-separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly-available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 +/- 1.0 and 4.4 +/- 0.8 nW m^-2 sr^-1 at 3.6 and 4.5 micron to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  15. SEDS: THE SPITZER EXTENDED DEEP SURVEY. SURVEY DESIGN, PHOTOMETRY, AND DEEP IRAC SOURCE COUNTS

    SciTech Connect

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Hernquist, L.; Hora, J. L. [Harvard-Smithsonian Center for Astrophysics, 60 Garden St., Cambridge, MA 02138 (United States); Arendt, R. [Observational Cosmology Laboratory, Code 665, Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Barmby, P. [University of Western Ontario, London, ON N6A 3K7 (Canada); Barro, G.; Faber, S.; Guhathakurta, P. [University of California Observatories/Lick Observatory and Department of Astronomy and Astrophysics University of California Santa Cruz, 1156 High St., Santa Cruz, CA 95064 (United States); Bell, E. F. [Department of Astronomy, University of Michigan, 500 Church St., Ann Arbor, MI 48109 (United States); Bouwens, R. [Leiden Observatory, Leiden University, NL-2300 RA Leiden (Netherlands); Cattaneo, A. [Aix Marseille Universite, CNRS, Laboratoire d'Astrophysique de Marseille, UMR 7326, F-13388, Marseille (France); Croton, D. [Centre for Astrophysics and Supercomputing, Swinburne University of Technology, P.O. Box 218 Hawthorn, VIC 3122 (Australia); Dave, R. [Department of Astronomy, University of Arizona, Tucson, AZ 85721 (United States); Dunlop, J. S. [Scottish Universities Physics Alliance, Institute for Astronomy, University of Edinburgh, Royal Observatory, Edinburgh, EH9 3HJ (United Kingdom); Egami, E. [Steward Observatory, University of Arizona, 933 N. Cherry Ave, Tucson, AZ 85721 (United States); Finlator, K. [Dark Cosmology Centre, Niels Bohr Institute, University of Copenhagen, Juliane Maries Vej 30, CK-2100 Copenhagen O (Denmark); Grogin, N. A., E-mail: mashby@cfa.harvard.edu [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); and others

    2013-05-20

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg^2 to a depth of 26 AB mag (3 sigma) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 μm. Because of its uniform depth of coverage in so many widely-separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly-available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m^-2 sr^-1 at 3.6 and 4.5 μm to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  16. Hierarchical and factorial mating designs for quantitative genetic analysis in tetrasomic potato.

    PubMed

    Ortiz, R.; Golmirzaie, A.

    2002-03-01

    Plant breeders need to quantify additive and non-additive components of genetic variance in order to determine appropriate selection methods to improve quantitative characteristics. Hierarchical and factorial mating designs (also known as North Carolina mating designs I and II, respectively) allow one to determine these variance components. The relative advantages of these two designs in the quantitative genetics of tuber yield in tetrasomic potato were investigated. Likewise, the number of female parents to include in design I was also investigated. Data were collected from two independent experiments at two contrasting Peruvian locations: La Molina in the dry coast and San Ramon in the humid mid-altitude. In the first experiment, although design I gave a negative digenic variance (σ²_D), this design provided almost the same estimate of narrow-sense heritability (h²) for tuber yield as that obtained in design II (0.291 and 0.260, respectively). Therefore, design I appears to be appropriate for quantitative genetics research in tetrasomic potato, a crop in which some clones are male sterile. The easy handling of crosses (distinct random females included in the crossing scheme) is another advantage of design I relative to design II. In the second experiment, 12 males were crossed with either two or four females following a design-I mating scheme. The additive genetic variance (σ²_A) was zero (or negative) when two females per male were included but was positive with four females. These results suggest that two females per male may not be enough for design I in tetrasomic potato. Four females per male are preferable to determine σ²_A in design I for this tetrasomic crop. PMID:12582673
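
    For orientation on the quantities quoted above, narrow-sense heritability in such mating designs is the share of phenotypic variance attributable to additive effects. In generic form (a textbook diploid-style decomposition, not the tetrasomic partition used in the paper, which also carries digenic and higher-order dominance terms):

        h^2 = \frac{\sigma^2_A}{\sigma^2_P} = \frac{\sigma^2_A}{\sigma^2_A + \sigma^2_D + \sigma^2_E}

    Here σ²_D and σ²_E denote dominance and environmental variance; design I estimates the components from the nesting of females within males, while design II uses the male x female factorial.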

  17. 50 CFR 600.1417 - Requirements for exempted state designation based on submission of recreational survey data.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...designation based on submission of recreational survey data. 600.1417 Section 600.1417...designation based on submission of recreational survey data. (a) To be designated as an...state's participation in a regional survey of marine and anadromous...

  18. Quantitative Feedback Theory (QFT) applied to the design of a rotorcraft flight control system

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Gorder, P. J.

    1992-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. Quantitative Feedback Theory is applied to the design of the longitudinal flight control system for a linear uncertain model of the AH-64 rotorcraft. In this model, the uncertainty is assigned, and is assumed to be attributable to actual uncertainty in the dynamic model and to the changes in the vehicle aerodynamic characteristics which occur near hover. The model includes an approximation to the rotor and actuator dynamics. The design example indicates the manner in which handling qualities criteria may be incorporated into the design of realistic rotorcraft control systems in which significant uncertainty exists in the vehicle model.
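
    For readers unfamiliar with the QFT formalism invoked here, the design problem is conventionally posed (in generic textbook form, not with this paper's specific tolerances) as choosing a compensator G and prefilter F such that, for every plant P in the uncertainty template,

        a(\omega) \le \left| \frac{F(j\omega)\,G(j\omega)\,P(j\omega)}{1 + G(j\omega)\,P(j\omega)} \right| \le b(\omega)
        \quad \text{and} \quad
        \left| \frac{G(j\omega)\,P(j\omega)}{1 + G(j\omega)\,P(j\omega)} \right| \le M ,

    where a(ω) and b(ω) are the lower and upper tracking tolerances and M bounds the closed-loop magnitude for robust stability. These frequency-domain bounds are what allow handling-qualities criteria to be written directly into the design.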

  19. Primer design using Primer Express® for SYBR Green-based quantitative PCR.

    PubMed

    Singh, Amarjeet; Pandey, Girdhar K

    2015-01-01

    To quantitate the gene expression, real-time RT-PCR or quantitative PCR (qPCR) is one of the most sensitive, reliable, and commonly used methods in molecular biology. The reliability and success of a real-time PCR assay depend on the optimal experiment design. Primers are the most important constituents of real-time PCR experiments such as in SYBR Green-based detection assays. Designing of an appropriate and specific primer pair is extremely crucial for correct estimation of transcript abundance of any gene in a given sample. Here, we are presenting a quick, easy, and reliable method for designing target-specific primers using Primer Express(®) software for real-time PCR (qPCR) experiments. PMID:25697658
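
    Primer Express automates these checks using nearest-neighbor thermodynamics; purely as an illustration of the kind of screening involved (rule-of-thumb thresholds, the simple Wallace-rule Tm, and a hypothetical 20-mer sequence, none of which come from this record), a minimal Python sketch might look like:

        def gc_content(seq):
            """Percent G+C in a primer sequence."""
            seq = seq.upper()
            return 100.0 * sum(base in "GC" for base in seq) / len(seq)

        def wallace_tm(seq):
            """Wallace rule: Tm ~ 2*(A+T) + 4*(G+C); a rough estimate for short oligos only."""
            seq = seq.upper()
            at = sum(base in "AT" for base in seq)
            gc = sum(base in "GC" for base in seq)
            return 2 * at + 4 * gc

        def screen_primer(seq, gc_range=(40, 60), tm_range=(58, 62), length_range=(18, 25)):
            """Return rule-of-thumb violations for a candidate SYBR Green qPCR primer."""
            problems = []
            if not (length_range[0] <= len(seq) <= length_range[1]):
                problems.append("length %d outside %s" % (len(seq), length_range))
            gc = gc_content(seq)
            if not (gc_range[0] <= gc <= gc_range[1]):
                problems.append("GC content %.1f%% outside %s" % (gc, gc_range))
            tm = wallace_tm(seq)
            if not (tm_range[0] <= tm <= tm_range[1]):
                problems.append("approximate Tm %.1f C outside %s" % (tm, tm_range))
            if seq.upper().endswith(("GGGG", "CCCC")):
                problems.append("3' G/C run may promote mispriming")
            return problems

        print(screen_primer("AGCTGACCTGAAGCTGATCG"))  # hypothetical 20-mer; prints []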

  20. Ten Years of LibQual: A Study of Qualitative and Quantitative Survey Results at the University of Mississippi 2001-2010

    ERIC Educational Resources Information Center

    Greenwood, Judy T.; Watson, Alex P.; Dennis, Melissa

    2011-01-01

    This article analyzes quantitative adequacy gap scores and coded qualitative comments from LibQual surveys at the University of Mississippi from 2001 to 2010, looking for relationships between library policy changes and LibQual results and any other trends that emerged. Analysis found no relationship between changes in policy and survey results…

  1. Quantitative Survey and Structural Classification of Fracking Chemicals Reported in Unconventional Gas Exploitation

    NASA Astrophysics Data System (ADS)

    Elsner, Martin; Schreglmann, Kathrin

    2015-04-01

    Few technologies are being discussed in such controversial terms as hydraulic fracturing ("fracking") in the recovery of unconventional gas. Particular concern regards the chemicals that may return to the surface as a result of hydraulic fracturing. These are either "fracking chemicals" - chemicals that are injected together with the fracking fluid to optimize the fracturing performance - or geogenic substances which may turn up during gas production, in the so-called produced water originating from the target formation. Knowledge about them is warranted for several reasons. (1) Monitoring. Air emissions are reported to arise from well drilling, the gas itself or condensate tanks. In addition, potential spills and accidents bear the danger of surface and shallow groundwater contaminations. Monitoring strategies are therefore warranted to screen for "indicator" substances of potential impacts. (2) Chemical Analysis. To meet these analytical demands, target substances must be defined so that adequate sampling approaches and analytical methods can be developed. (3) Transformation in the Subsurface. Identification and classification of fracking chemicals (aromatics vs. alcohols vs. acids, esters, etc.) is further important to assess the possibility of subsurface reactions which may potentially generate new, as yet unidentified transformation products. (4) Wastewater Treatment. For the same reason chemical knowledge is important for optimized wastewater treatment strategies. (5) Human and Ecosystem Health. Knowledge of the most frequent fracking chemicals is further essential for risk assessment (environmental behavior, toxicity). (6) Public Discussions. Finally, an overview of reported fracking chemicals can provide unbiased scientific input into current public debates and enable critical reviews of Green Chemistry approaches. Presently, however, such information is not readily available. We aim to close this knowledge gap by providing a quantitative overview of chemical additives reported for use in hydraulic fracturing. For the years 2005-2009 it is based on the Waxman report, and for the years 2011-2013 it relies on the database FracFocus, where it makes use of the data extracted and provided by the website "SkyTruth". For the first time, we list fracking chemicals according to their chemical structure and functional groups, because these properties are important as a starting point for (i) the design of analytical methods, (ii) to assess environmental fate and (iii) to understand why a given chemical is used at a certain stage of the fracturing process and what possible alternatives exist.

  2. Spacecraft drag-free attitude control system design with Quantitative Feedback Theory

    Microsoft Academic Search

    Shu-Fan Wu; Denis Fertin

    2008-01-01

    One of the key technologies to be demonstrated on board the LISA Pathfinder spacecraft (S/C) is the drag-free attitude control system (DFACS), aiming to control the S/C attitude and the relative motion of the S/C test masses with a precision of the order of a nanometer. This paper explores how the controllers could be designed and tuned with the Quantitative Feedback Theory

  3. Considerations in Designing Survey Studies and Follow-Up Systems for Special Education Service Programs.

    ERIC Educational Resources Information Center

    Bruininks, Robert H.; And Others

    1990-01-01

    This paper examines considerations for designing a postschool follow-up system in secondary special education, focusing on survey research techniques and special applications of survey methodologies, including data collection techniques, questionnaire construction, sample design and contact, response rates, and tracking procedures. Design and…

  4. Surveys of Need for Office Design and Other Design Skills Among Former Interior Design Graduates and Employers of Designers. Volume IX, No. 15.

    ERIC Educational Resources Information Center

    Daly, Pat; Lucas, John A.

    In order to measure the demand for an office design course among interior design graduates of Harper College and to determine the design skill needs of area employers, two studies were undertaken. An initial mail survey, a second mailing, and a series of telephone calls yielded a 75% response rate from the program's 88 graduates. Of the…

  5. Pragmatic soil survey design using flexible Latin hypercube sampling

    NASA Astrophysics Data System (ADS)

    Clifford, David; Payne, James E.; Pringle, M. J.; Searle, Ross; Butler, Nathan

    2014-06-01

    We review and give a practical example of Latin hypercube sampling in soil science using an approach we call flexible Latin hypercube sampling. Recent studies of soil properties in large and remote regions have highlighted problems with the conventional Latin hypercube sampling approach. It is often impractical to travel far from tracks and roads to collect samples, and survey planning should recognise this fact. Another problem is how to handle target sites that, for whatever reason, are impractical to sample - should one just move on to the next target or choose something in the locality that is accessible? Working within a Latin hypercube that spans the covariate space, selecting an alternative site is hard to do optimally. We propose flexible Latin hypercube sampling as a means of avoiding these problems. Flexible Latin hypercube sampling involves simulated annealing for optimally selecting accessible sites from a region. The sampling protocol also produces an ordered list of alternative sites close to the primary target site, should the primary target site prove inaccessible. We highlight the use of this design through a broad-scale sampling exercise in the Burdekin catchment of north Queensland, Australia. We highlight the robustness of our design through a simulation study where up to 50% of target sites may be inaccessible.
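
    The record does not reproduce the authors' algorithm, but the recipe it describes - spread the selected sites across covariate quantile bins, restrict the search to accessible candidate locations, and improve the spread by simulated annealing - can be sketched as below. The cost function, cooling schedule, and synthetic data are illustrative placeholders, not the published flexible Latin hypercube design.

        import math, random

        def lh_cost(sample, quantile_bins, n_bins):
            # A perfect Latin hypercube puts exactly one selected site in each
            # quantile bin of every covariate; count departures from that ideal.
            cost = 0
            for bins in quantile_bins:          # bins[i] = bin index (0..n_bins-1) of candidate i
                counts = [0] * n_bins
                for i in sample:
                    counts[bins[i]] += 1
                cost += sum(abs(c - 1) for c in counts)
            return cost

        def flexible_lhs(n_candidates, quantile_bins, n, accessible,
                         iters=20000, temp=1.0, cool=0.9995, seed=1):
            # Simulated annealing restricted to accessible candidate sites.
            rng = random.Random(seed)
            pool = [i for i in range(n_candidates) if accessible[i]]
            sample = rng.sample(pool, n)
            cost = lh_cost(sample, quantile_bins, n)
            for _ in range(iters):
                new = sample[:]
                replacement = rng.choice(pool)
                if replacement in new:
                    continue
                new[rng.randrange(n)] = replacement
                c = lh_cost(new, quantile_bins, n)
                if c <= cost or rng.random() < math.exp(-(c - cost) / temp):
                    sample, cost = new, c
                temp *= cool
            return sample

        # Tiny synthetic demo: 200 candidates, two covariates in 8 quantile bins, ~60% accessible
        demo = random.Random(0)
        bins = [[demo.randrange(8) for _ in range(200)] for _ in range(2)]
        access = [demo.random() < 0.6 for _ in range(200)]
        print(flexible_lhs(200, bins, 8, access))

    An ordered list of nearby fall-back sites, as used in the paper, could be produced in the same spirit by ranking the remaining accessible candidates in each selected site's neighbourhood.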

  6. Current State of Agile User-Centered Design: A Survey

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    Agile software development methods are quite popular nowadays and are being adopted at an increasing rate in the industry every year. However, these methods are still lacking usability awareness in their development lifecycle, and the integration of usability/User-Centered Design (UCD) into agile methods is not adequately addressed. This paper presents the preliminary results of a recently conducted online survey regarding the current state of the integration of agile methods and usability/UCD. A world wide response of 92 practitioners was received. The results show that the majority of practitioners perceive that the integration of agile methods with usability/UCD has added value to their adopted processes and to their teams; has resulted in the improvement of usability and quality of the product developed; and has increased the satisfaction of the end-users of the product developed. The top most used HCI techniques are low-fidelity prototyping, conceptual designs, observational studies of users, usability expert evaluations, field studies, personas, rapid iterative testing, and laboratory usability testing.

  7. Antibody Drug Conjugates: Application of Quantitative Pharmacology in Modality Design and Target Selection.

    PubMed

    Sadekar, S; Figueroa, I; Tabrizi, M

    2015-07-01

    Antibody drug conjugates (ADCs) are a multi-component modality comprising an antibody targeting a cell-specific antigen, a potent drug/payload, and a linker that can be processed within cellular compartments to release the payload upon internalization. Numerous ADCs are being evaluated in both research and clinical settings within academia and the pharmaceutical industry due to their ability to selectively deliver potent payloads. Hence, there is a clear need to incorporate quantitative approaches during early stages of drug development for effective modality design and target selection. In this review, we describe a quantitative approach and framework for evaluation of the interplay between drug- and systems-dependent properties (i.e., target expression, density, localization, turnover, and affinity) in order to deliver a sufficient amount of a potent payload into the relevant target cells. As discussed, theoretical approaches with particular consideration given to various key properties of the target and modality suggest that delivery of the payload into particular effect cells is more sensitive to antigen concentrations for targets with slow turnover rates as compared to those with faster internalization rates. Further assessments also suggest that increasing doses beyond the threshold of the target capacity (a function of target internalization and expression) may not impact the maximum amount of payload delivered to the intended effect cells. This article will explore the important application of quantitative sciences in selection of the target and design of ADC modalities. PMID:25933599

  8. Meta‐Regression Analysis: A Quantitative Method of Literature Surveys

    Microsoft Academic Search

    T. D. Stanley; Stephen B. Jarrell

    2005-01-01

    Pedagogically, literature reviews are instrumental. They summarize the large literature written on a particular topic, give coherence to the complex, often disparate, views expressed about an issue, and serve as a springboard for new ideas. However, literature surveys rarely establish anything approximating unanimous consensus. Ironically, this is just as true for the empirical economic literature. To harmonize this dissonance, we

  9. Meta-Regression Analysis: A Quantitative Method of Literature Surveys

    Microsoft Academic Search

    T D Stanley; Stephen B Jarrell

    1989-01-01

    Pedagogically, literature reviews are instrumental. They summarize the large literature written on a particular topic, give coherence to the complex, often disparate, views expressed about an issue, and serve as a springboard for new ideas. However, literature surveys rarely establish anything approximating unanimous consensus. Ironically, this is just as true for the empirical economic literature. To harmonize this dissonance, we
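
    The meta-regression model referred to in this record and the preceding one is, in its commonly cited generic form (not quoted from either abstract),

        b_j = \beta + \sum_{k=1}^{K} \alpha_k Z_{jk} + e_j , \qquad j = 1, \ldots, L ,

    where b_j is the reported estimate of the effect of interest in the j-th study, the Z_{jk} are moderator variables coding study characteristics suspected of driving inter-study variation, and e_j is the meta-regression disturbance.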

  10. A Critical Analysis of Interview, Telephone, and Mail Survey Designs.

    ERIC Educational Resources Information Center

    Katz, Elinor

    A critical analysis is presented of the literature as it relates to survey research, including personal interviews, telephone interviews, and mail questionnaires. Additional research concerns are explored, and a code of ethics for survey researchers is presented. Focus groups, interviews, long interviews, telephone interviews, and mail surveys are…

  11. SOME STATISTICAL CONSIDERATIONS OF THE DESIGN OF TRAWL SURVEYS FOR ROCKFISH (SCORPAENIDAE)

    E-print Network

    Statistical theory for choosing among random, stratified random, and systematic sample survey schemes is considered, drawing on data collected during a pilot trawl survey for rockfish in Queen Charlotte Sound, British Columbia, and a full scale

  12. A quantitative and objective evaluation approach for optimal selection of design concept in conceptual design stage

    E-print Network

    Tiwari, Sanjay

    2002-01-01

    [Front-matter fragment only: the record's text is drawn from the thesis contents and list of tables, covering incorporation of the presented approach in the IIDE design framework, an independence index per type of design solution, extracted contact and no-interference vectors for an example assembly layout, checks for the presence of contact and no-interference, and unfeasible higher-order subassemblies in the example assembly.]

  13. Rotorcraft control system design for uncertain vehicle dynamics using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1994-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which must meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. This theory is applied to the design of the longitudinal flight control system for a linear model of the BO-105C rotorcraft. Uncertainty in the vehicle model is due to the variation in the vehicle dynamics over a range of airspeeds from 0-100 kts. For purposes of exposition, the vehicle description contains no rotor or actuator dynamics. The design example indicates the manner in which significant uncertainty exists in the vehicle model. The advantage of using a sequential loop closure technique to reduce the cost of feedback is demonstrated by example.

  14. Rotorcraft flight control design using quantitative feedback theory and dynamic crossfeeds

    NASA Technical Reports Server (NTRS)

    Cheng, Rendy P.

    1995-01-01

    A multi-input, multi-output controls design with robust crossfeeds is presented for a rotorcraft in near-hovering flight using quantitative feedback theory (QFT). Decoupling criteria are developed for dynamic crossfeed design and implementation. Frequency dependent performance metrics focusing on piloted flight are developed and tested on 23 flight configurations. The metrics show that the resulting design is superior to alternative control system designs using conventional fixed-gain crossfeeds and to feedback-only designs which rely on high gains to suppress undesired off-axis responses. The use of dynamic, robust crossfeeds prior to the QFT design reduces the magnitude of required feedback gain and results in performance that meets current handling qualities specifications relative to the decoupling of off-axis responses. The combined effect of the QFT feedback design following the implementation of low-order, dynamic crossfeed compensator successfully decouples ten of twelve off-axis channels. For the other two channels it was not possible to find a single, low-order crossfeed that was effective.

  15. Quantitative autistic traits ascertained in a national survey of 22 529 Japanese schoolchildren

    PubMed Central

    Kamio, Y; Inada, N; Moriwaki, A; Kuroda, M; Koyama, T; Tsujii, H; Kawakubo, Y; Kuwabara, H; Tsuchiya, K J; Uno, Y; Constantino, J N

    2013-01-01

    Objective Recent epidemiologic studies worldwide have documented a rise in prevalence rates for autism spectrum disorders (ASD). Broadening of diagnostic criteria for ASD may be a major contributor to the rise in prevalence, particularly if superimposed on an underlying continuous distribution of autistic traits. This study sought to determine the nature of the population distribution of autistic traits using a quantitative trait measure in a large national population sample of children. Method The Japanese version of the Social Responsiveness Scale (SRS) was completed by parents on a nationally representative sample of 22 529 children, age 6–15. Results Social Responsiveness Scale scores exhibited a skewed normal distribution in the Japanese population with a single-factor structure and no significant relation to IQ within the normal intellectual range. There was no evidence of a natural ‘cutoff’ that would differentiate populations of categorically affected children from unaffected children. Conclusion This study provides evidence of the continuous nature of autistic symptoms measured by the SRS, a validated quantitative trait measure. The findings reveal how paradigms for diagnosis that rest on arbitrarily imposed categorical cutoffs can result in substantial variation in prevalence estimation, especially when measurements used for case assignment are not standardized for a given population. PMID:23171198

  16. Textile materials for the design of wearable antennas: a survey.

    PubMed

    Salvado, Rita; Loss, Caroline; Gonçalves, Ricardo; Pinho, Pedro

    2012-01-01

    In the broad context of Wireless Body Sensor Networks for healthcare and pervasive applications, the design of wearable antennas offers the possibility of ubiquitous monitoring, communication and energy harvesting and storage. Specific requirements for wearable antennas are a planar structure and flexible construction materials. Several properties of the materials influence the behaviour of the antenna. For instance, the bandwidth and the efficiency of a planar microstrip antenna are mainly determined by the permittivity and the thickness of the substrate. The use of textiles in wearable antennas requires the characterization of their properties. Specific electrical conductive textiles are available on the market and have been successfully used. Ordinary textile fabrics have been used as substrates. However, little information can be found on the electromagnetic properties of regular textiles. Therefore this paper is mainly focused on the analysis of the dielectric properties of normal fabrics. In general, textiles present a very low dielectric constant that reduces the surface wave losses and increases the impedance bandwidth of the antenna. However, textile materials are constantly exchanging water molecules with the surroundings, which affects their electromagnetic properties. In addition, textile fabrics are porous, anisotropic and compressible materials whose thickness and density might change with low pressures. Therefore it is important to know how these characteristics influence the behaviour of the antenna in order to minimize unwanted effects. This paper presents a survey of the key points for the design and development of textile antennas, from the choice of the textile materials to the framing of the antenna. An analysis of the textile materials that have been used is also presented. PMID:23202235
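
    To make concrete how substrate permittivity and thickness enter such a design, the standard transmission-line equations for sizing a rectangular microstrip patch (textbook formulas, not results from this survey) can be coded directly; the 2.45 GHz, eps_r = 1.6, 1 mm fabric values below are purely hypothetical:

        import math

        C = 299_792_458.0  # speed of light, m/s

        def patch_dimensions(f0, eps_r, h):
            """Rectangular microstrip patch width and length from the transmission-line model.
            f0 : design frequency (Hz); eps_r : substrate relative permittivity;
            h : substrate thickness (m). Returns (W, L) in metres."""
            W = C / (2 * f0) * math.sqrt(2.0 / (eps_r + 1.0))
            eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 * (1 + 12 * h / W) ** -0.5
            dL = 0.412 * h * ((eps_eff + 0.3) * (W / h + 0.264)) / ((eps_eff - 0.258) * (W / h + 0.8))
            L = C / (2 * f0 * math.sqrt(eps_eff)) - 2 * dL
            return W, L

        # Hypothetical 2.45 GHz wearable patch on a 1 mm fabric with eps_r ~ 1.6
        print(patch_dimensions(2.45e9, 1.6, 1e-3))

    The low permittivity typical of textiles drives the effective permittivity toward 1, which widens the patch and, as the abstract notes, tends to increase impedance bandwidth while reducing surface-wave losses.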

  17. Trajectory Design for the Transiting Exoplanet Survey Satellite (TESS)

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald J.; Parker, Joel; Williams, Trevor; Mendelsohn, Chad

    2014-01-01

    The Transiting Exoplanet Survey Satellite (TESS) is a National Aeronautics and Space Administration (NASA) mission launching in 2017. TESS will travel in a highly eccentric orbit around Earth, with initial perigee radius near 17 Earth radii (Re) and apogee radius near 59 Re. The orbit period is near 2:1 resonance with the Moon, with apogee nearly 90 degrees out-of-phase with the Moon, in a configuration that has been shown to be operationally stable. TESS will execute phasing loops followed by a lunar flyby, with a final maneuver to achieve 2:1 resonance with the Moon. The goals of a resonant orbit with long-term stability, short eclipses and limited oscillations of perigee present significant challenges to the trajectory design. To rapidly assess launch opportunities, we adapted the SWM76 launch window tool to assess the TESS mission constraints. To understand the long-term dynamics of such a resonant orbit in the Earth-Moon system we employed Dynamical Systems Theory in the Circular Restricted 3-Body Problem (CR3BP). For precise trajectory analysis we use a high-fidelity model and multiple shooting in the General Mission Analysis Tool (GMAT) to optimize the maneuver delta-V and meet mission constraints. Finally we describe how the techniques we have developed can be applied to missions with similar requirements.

  18. Trajectory Design for the Transiting Exoplanet Survey Satellite

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald J.; Parker, Joel J. K.; Williams, Trevor W.; Mendelsohn, Chad R.

    2014-01-01

    The Transiting Exoplanet Survey Satellite (TESS) is a National Aeronautics and Space Administration (NASA) mission, scheduled to be launched in 2017. TESS will travel in a highly eccentric orbit around Earth, with initial perigee radius near 17 Earth radii (Re) and apogee radius near 59 Re. The orbit period is near 2:1 resonance with the Moon, with apogee nearly 90 degrees out-of-phase with the Moon, in a configuration that has been shown to be operationally stable. TESS will execute phasing loops followed by a lunar flyby, with a final maneuver to achieve 2:1 resonance with the Moon. The goals of a resonant orbit with long-term stability, short eclipses and limited oscillations of perigee present significant challenges to the trajectory design. To rapidly assess launch opportunities, we adapted the Schematics Window Methodology (SWM76) launch window analysis tool to assess the TESS mission constraints. To understand the long-term dynamics of such a resonant orbit in the Earth-Moon system we employed Dynamical Systems Theory in the Circular Restricted 3-Body Problem (CR3BP). For precise trajectory analysis we use a high-fidelity model and multiple shooting in the General Mission Analysis Tool (GMAT) to optimize the maneuver delta-V and meet mission constraints. Finally we describe how the techniques we have developed can be applied to missions with similar requirements. Keywords: resonant orbit, stability, lunar flyby, phasing loops, trajectory optimization

  19. Design Unit Consultant Data Set Expertise Survey Data Sets

    E-print Network

    Derisi, Joseph

    [Fragmented table of consultants and survey data sets, including infant and fetal death certificates, the National Ambulatory Medical Care Survey, the National Hospital Ambulatory Medical Care Survey (NHAMCS), and Medicare patient data on health care access, cost, and use, 1991-present.]

  20. Quantitative assessment of climate change and human activities impact on the designed annual runoff

    NASA Astrophysics Data System (ADS)

    Hu, Yiming; Liang, Zhongmin; Liu, Yongwei

    2015-04-01

    In recent years, more and more researchers have studied the impact of climate change and human activities on runoff in the flood context. In this study, we propose a novel statistical method to quantitatively analyze the contribution of climate change and human activities to runoff change. The method is based on the assumption that if a given x-year designed precipitation is input to the hydrological model, the return period of the corresponding output runoff is also x years. This assumption has been widely used in the hydrological field when precipitation data are used to estimate the designed flood for a given design horizon. Compared to most current studies that use a hydrological model to simulate the change, the proposed method needs less data, which makes it easy to implement. The method is employed to analyze the impact of climate change and human activities on the designed annual runoff for different design horizons in the upper basin of Tangnaihai station. The M-K test result shows that the annual runoff series has a decreasing trend. The quantitative impact assessment results show that for the 1000-, 100-, and 50-year return periods, the designed annual runoff after 1989 was reduced by 24.8%, 24.6% and 24.2% respectively compared to that before 1989. Climate change accounts for 71.1%, 65.7% and 63.2% of the decrease in the 1000-, 100-, and 50-year designed annual runoff respectively, while human activities account for 28.9%, 34.3% and 36.8% respectively. Overall, the impact of climate change on annual runoff is higher than that of human activities. Keywords: annual runoff; climate change; human activities; impact assessment
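
    The attribution percentages quoted above follow the usual accounting logic for separating climate from human drivers (stated generically here; the authors' statistical procedure may differ in detail): simulate the post-change period with a model that excludes human impacts, and split the observed change accordingly. A sketch with hypothetical numbers:

        def attribute_runoff_change(q_obs_pre, q_obs_post, q_sim_post):
            """Split the change in a designed annual runoff between climate and human drivers.
            q_obs_pre  : designed runoff estimated from the pre-change period (e.g. before 1989)
            q_obs_post : designed runoff estimated from the post-change period
            q_sim_post : designed runoff simulated for the post-change climate with a model
                         calibrated on the pre-change period (so human impacts are excluded)"""
            total = q_obs_post - q_obs_pre
            climate = q_sim_post - q_obs_pre
            human = total - climate
            return {"total_change": total,
                    "climate_share": climate / total,
                    "human_share": human / total}

        # Hypothetical values only (not the Tangnaihai results): a ~25% reduction split ~70/30
        print(attribute_runoff_change(100.0, 75.2, 82.6))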

  1. Professional values and reported behaviours of doctors in the USA and UK: quantitative survey

    PubMed Central

    Rao, Sowmya R; Sibbald, Bonnie; Hann, Mark; Harrison, Stephen; Walter, Alex; Guthrie, Bruce; Desroches, Catherine; Ferris, Timothy G; Campbell, Eric G

    2011-01-01

    Background The authors aimed to determine US and UK doctors' professional values and reported behaviours, and the extent to which these vary with the context of care. Method 1891 US and 1078 UK doctors completed the survey (64.4% and 40.3% response rate respectively). Multivariate logistic regression was used to compare responses to identical questions in the two surveys. Results UK doctors were more likely to have developed practice guidelines (82.8% UK vs 49.6% US, p<0.001) and to have taken part in a formal medical error-reduction programme (70.9% UK vs 55.7% US, p<0.001). US doctors were more likely to agree about the need for periodic recertification (completely agree 23.4% UK vs 53.9% US, p<0.001). Nearly a fifth of doctors had direct experience of an impaired or incompetent colleague in the previous 3 years. Where the doctor had not reported the colleague to relevant authorities, reasons included thinking that someone else was taking care of the problem, believing that nothing would happen as a result, or fear of retribution. UK doctors were more likely than US doctors to agree that significant medical errors should always be disclosed to patients. More US doctors reported that they had not disclosed an error to a patient because they were afraid of being sued. Discussion The context of care may influence both how professional values are expressed and the extent to which behaviours are in line with stated values. Doctors have an important responsibility to develop their healthcare systems in ways which will support good professional behaviour. PMID:21383386

  2. Quantitative optical imaging and sensing by joint design of point spread functions and estimation algorithms

    NASA Astrophysics Data System (ADS)

    Quirin, Sean Albert

    The joint application of tailored optical Point Spread Functions (PSF) and estimation methods is an important tool for designing quantitative imaging and sensing solutions. By enhancing the information transfer encoded by the optical waves into an image, matched post-processing algorithms are able to complete tasks with improved performance relative to conventional designs. In this thesis, new engineered PSF solutions with image processing algorithms are introduced and demonstrated for quantitative imaging using information-efficient signal processing tools and/or optical-efficient experimental implementations. The use of a 3D engineered PSF, the Double-Helix (DH-PSF), is applied as one solution for three-dimensional, super-resolution fluorescence microscopy. The DH-PSF is a tailored PSF which was engineered to have enhanced information transfer for the task of localizing point sources in three dimensions. Both an information- and optical-efficient implementation of the DH-PSF microscope are demonstrated here for the first time. This microscope is applied to image single-molecules and micro-tubules located within a biological sample. A joint imaging/axial-ranging modality is demonstrated for application to quantifying sources of extended transverse and axial extent. The proposed implementation has improved optical-efficiency relative to prior designs due to the use of serialized cycling through select engineered PSFs. This system is demonstrated for passive-ranging, extended Depth-of-Field imaging and digital refocusing of random objects under broadband illumination. Although the serialized engineered PSF solution is an improvement over prior designs for the joint imaging/passive-ranging modality, it requires the use of multiple PSFs---a potentially significant constraint. Therefore an alternative design is proposed, the Single-Helix PSF, where only one engineered PSF is necessary and the chromatic behavior of objects under broadband illumination provides the necessary information transfer. The matched estimation algorithms are introduced along with an optically-efficient experimental system to image and passively estimate the distance to a test object. An engineered PSF solution is proposed for improving the sensitivity of optical wave-front sensing using a Shack-Hartmann Wave-front Sensor (SHWFS). The performance limits of the classical SHWFS design are evaluated and the engineered PSF system design is demonstrated to enhance performance. This system is fabricated and the mechanism for additional information transfer is identified.

  3. An adaptive sampling method based on optimized sampling design for fishery-independent surveys with comparisons with conventional designs

    Microsoft Academic Search

    Yong Liu; Yong Chen; Jiahua Cheng; Jianjian Lu

    The adaptive cluster sampling method is widely applied in terrestrial systems; however, it is not suitable for fisheries surveys because of the high cost of unlimited sampling in practice. An adaptive approach is often used in fisheries surveys to allocate sampling effort, usually following a stratified random design. Development of an adaptive sampling method based on optimized sampling design (this

  4. Survey Design Research: A Tool for Answering Nursing Research Questions.

    PubMed

    Siedlecki, Sandra L; Butler, Robert S; Burchill, Christian N

    2015-01-01

    The clinical nurse specialist is in a unique position to identify and study clinical problems in need of answers, but lack of time and resources may discourage nurses from conducting research. However, some research methods can be used by the clinical nurse specialist that are not time-intensive or cost prohibitive. The purpose of this article is to explain the utility of survey methodology for answering a number of nursing research questions. The article covers survey content, reliability and validity issues, sample size considerations, and methods of survey delivery. PMID:26053608

  5. ESTIMATING AMPHIBIAN OCCUPANCY RATES IN PONDS UNDER COMPLEX SURVEY DESIGNS

    EPA Science Inventory

    Monitoring the occurrence of specific amphibian species in ponds is one component of the US Geological Survey's Amphibian Monitoring and Research Initiative. Two collaborative studies were conducted in Olympic National Park and southeastern region of Oregon. The number of ponds...

  6. Psychology 815 Quantitative Research Design & Analysis in Psychology -Fall 2013 Instructor: Dr. Debby Kashy Office Hours: Arranged via email

    E-print Network

    Liu, Taosheng

    Course text: Keppel, G., & Wickens, T. D. (2004). Design and Analysis: A Researcher's Handbook (4th Edition). Prentice Hall. The course deals with the statistical analysis of data from experimental and nonexperimental research. We will begin with analysis

  7. A survey of what customers want in a cell phone design

    Microsoft Academic Search

    Chen Ling; Wonil Hwang; Gavriel Salvendy

    2007-01-01

    The cell phone is an information appliance that has been widely used. It provides instant access to information and makes people more ‘connected’. The objective of our study is to investigate the relationship among the design features of the cell phone and identify the most important design features and design factors. In our survey study, we asked 1,006 college students

  8. SURVEY GUIDE SURVEY FUNDAMENTALS

    E-print Network

    Shapiro, Vadim

    SURVEY FUNDAMENTALS: A Guide to Designing and Implementing Surveys. Office of Quality Improvement. This guide describes in non-technical terms the underlying principles of good survey design and implementation. Clear, simple explanations lead the reader

  9. Few Canadian hospitals qualify for "Baby Friendly" designation by promoting breast-feeding: survey.

    PubMed Central

    Dunlop, M

    1995-01-01

    Only five Canadian hospitals meet requirements for promoting breast-feeding as set out by the World Health Organization and qualify to receive the international "Baby Friendly" designation, a national survey has determined. Results from the survey of 523 hospitals were released at a Toronto conference. PMID:7804924

  10. National Comorbidity Survey Replication Adolescent Supplement (NCS-A): II. Overview and Design

    ERIC Educational Resources Information Center

    Kessler, Ronald C.; Avenevoli, Shelli; Costello, E. Jane; Green, Jennifer Greif; Gruber, Michael J.; Heeringa, Steven; Merikangas, Kathleen R.; Pennell, Beth-Ellen; Sampson, Nancy A.; Zaslavsky, Alan M.

    2009-01-01

    The national comorbidity survey that seeks to determine the prevalence and correlates of mental disorders among U.S. adolescents is based on a dual-frame design that includes 904 adolescents from a previous household survey and 9,244 adolescent students from a sample of 320 schools. Replacement schools for those that refuse to participate do not…

  11. Design, Fielding, and Analysis of School-Based Surveys on Health Behaviors in

    E-print Network

    Lewis, Robert Michael

    Among the survey results, parents know most about the after-school clubs, cafeteria food programs, and the print, web, and school-based health promotions. Low-income and minority populations

  12. Incorporating Complex Sample Design Effects When Only Final Survey Weights are Available

    PubMed Central

    West, Brady T.; McCabe, Sean Esteban

    2012-01-01

    This article considers the situation that arises when a survey data producer has collected data from a sample with a complex design (possibly featuring stratification of the population, cluster sampling, and / or unequal probabilities of selection), and for various reasons only provides secondary analysts of those survey data with a final survey weight for each respondent and “average” design effects for survey estimates computed from the data. In general, these “average” design effects, presumably computed by the data producer in a way that fully accounts for all of the complex sampling features, already incorporate possible increases in sampling variance due to the use of the survey weights in estimation. The secondary analyst of the survey data who then 1) uses the provided information to compute weighted estimates, 2) computes design-based standard errors reflecting variance in the weights (using Taylor Series Linearization, for example), and 3) inflates the estimated variances using the “average” design effects provided is applying a “double” adjustment to the standard errors for the effect of weighting on the variance estimates, leading to overly conservative inferences. We propose a simple method for preventing this problem, and provide a Stata program for applying appropriate adjustments to variance estimates in this situation. We illustrate two applications of the method to survey data from the Monitoring the Future (MTF) study, and conclude with suggested directions for future research in this area. PMID:24596541
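
    The Stata program mentioned in the abstract is not reproduced here; as an illustration of the general idea - assuming the supplied "average" design effect already contains the weighting component, which Kish's 1 + cv^2 of the weights approximates - a secondary analyst could back the weighting factor out before inflating a weight-aware (e.g., Taylor-linearized) standard error:

        import math

        def kish_weighting_deff(weights):
            # Kish's approximation to the variance inflation caused by unequal weights:
            # deff_w = 1 + cv^2(w) = n * sum(w^2) / (sum(w))^2
            n = len(weights)
            s1 = sum(weights)
            s2 = sum(w * w for w in weights)
            return n * s2 / (s1 * s1)

        def adjusted_se(linearized_se, average_deff, weights):
            """Inflate a weight-aware standard error by only the portion of the supplied
            average design effect not already attributable to weighting."""
            deff_w = kish_weighting_deff(weights)
            residual_deff = max(average_deff / deff_w, 1.0)  # clustering/stratification part
            return linearized_se * math.sqrt(residual_deff)

        # Hypothetical example with a provided average design effect of 2.0
        weights = [1.0, 1.5, 0.8, 1.2, 2.0, 0.9, 1.1, 1.4]
        print(adjusted_se(0.05, 2.0, weights))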

  13. Survey of North American Bicycle Commuters: Design and Aggregate Results

    Microsoft Academic Search

    WILLIAM E. MORITZ

    1997-01-01

    Although interest exists in promoting bicycle commuting to help meet air quality and commuter-trip reduction goals, there are virtually no data on bicycle commuters. A comprehensive survey, distributed over the Internet and by mail, of such commuters has been conducted, with 2,374 responses received from all regions of the United States and Canada. Information was gathered in seven categories: about

  14. A Survey on Network Coordinates Systems, Design, and Security

    E-print Network

    Castelluccia, Claude

    Network Coordinates Systems (NCS) have been proposed to allow hosts to predict latencies without performing direct measurements. NCS opened new research fields in which the networking community has produced an impressive amount of work. In this paper, we survey the various NCS proposed as well as their intrinsic limits. In particular, we focus

  15. ESTIMATING PROPORTION OF AREA OCCUPIED UNDER COMPLEX SURVEY DESIGNS

    EPA Science Inventory

    Estimating proportion of sites occupied, or proportion of area occupied (PAO) is a common problem in environmental studies. Typically, field surveys do not ensure that occupancy of a site is made with perfect detection. Maximum likelihood estimation of site occupancy rates when...
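
    The estimation problem described here is usually formalized with the zero-inflated binomial occupancy likelihood (stated generically; the study's exact estimator may differ). For site i with detection history y_i over K visits,

        L(\psi, p) = \prod_{i} \left[ \psi \prod_{k=1}^{K} p^{\,y_{ik}} (1-p)^{1-y_{ik}} \;+\; (1-\psi)\, I\!\left( \sum_{k} y_{ik} = 0 \right) \right] ,

    where ψ is the proportion of area occupied and p is the per-visit detection probability; the mixture allows an all-zero history to arise either from an occupied site that was never detected or from a genuinely unoccupied site, which is exactly the imperfect-detection issue the abstract raises.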

  16. Development of quantitative structure-activity relationships and its application in rational drug design.

    PubMed

    Yang, Guang-Fu; Huang, Xiaoqin

    2006-01-01

    Over forty years have elapsed since Hansch and Fujita published their pioneering work on quantitative structure-activity relationships (QSAR). Following the introduction of Comparative Molecular Field Analysis (CoMFA) by Cramer in 1988, other three-dimensional QSAR methods have been developed. Currently, the combination of classical QSAR and other computational techniques at the three-dimensional level is of greatest interest and is generally used in the process of modern drug discovery and design. During the last several decades, a number of different methodologies incorporating a range of molecular descriptors and different statistical regression approaches have been proposed and successfully applied in the development of new drugs; thus the QSAR method has been proven to be indispensable not only for the reliable prediction of specific properties of new compounds, but also for helping to elucidate the possible molecular mechanisms of receptor-ligand interactions. Here, we review the recent developments in QSAR and their applications in rational drug design, focusing on the reasonable selection of novel molecular descriptors and the construction of predictive QSAR models with the help of advanced computational techniques. PMID:17168765

  17. Analysis on quantitative relationship between design parameters of infrared remote sensor and NIIRS

    NASA Astrophysics Data System (ADS)

    Jin, Yingji; Bai, Honggang

    2011-08-01

    The National Imagery Interpretability Rating Scale (NIIRS) is a 10-level scale (0 to 9) of imagery interpretability; its criteria indicate the level of information that can be extracted from an image of a given interpretability, with Level 0 the lowest and Level 9 the highest. The General Image Quality Equation (GIQE) is a model that relates physical parameters of an imaging sensor to the NIIRS rating of the sensor's image products. The scale has become an important tool for defining image requirements and for selecting and tasking imaging systems. In this paper, we first briefly introduce NIIRS and GIQE and make an initial analysis of the factors affecting perceived interpretability of imagery, such as ground sample distance (GSD), relative edge response (RER), height overshoot (H), noise gain (G), and signal-to-noise ratio (SNR). Then, the quantitative relationship between the design parameters of the infrared remote sensor and GSD, RER, H, SNR, and the NIIRS scale is determined, and the simulation curve of the NIIRS scale versus R is presented. Finally, the analysis of the NIIRS scale against GSD, RER, H and G/SNR shows that GSD and RER are the dominant terms in the equation, and that the overshoot H and the G/SNR term have a much smaller impact. Comparisons with calculation results show that the research can provide preliminary theoretical evidence for the optimum design of remote sensors, although more validation experiments are needed.
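
    For reference, the commonly published GIQE 4.0 form (quoted from the open literature, not from this paper, with GSD in inches) makes the dominance of GSD and RER easy to see; a direct transcription:

        import math

        def giqe4_niirs(gsd_inches, rer, h, g, snr):
            """General Image Quality Equation, version 4.0, as commonly published.
            gsd_inches : geometric-mean ground sample distance in inches
            rer        : geometric-mean relative edge response
            h          : geometric-mean edge overshoot
            g          : noise gain of any sharpening applied
            snr        : signal-to-noise ratio"""
            a, b = (3.32, 1.559) if rer >= 0.9 else (3.16, 2.817)
            return 10.251 - a * math.log10(gsd_inches) + b * math.log10(rer) \
                   - 0.656 * h - 0.344 * (g / snr)

        # Hypothetical sensor: 0.5 m (~19.7 in) GSD, RER 0.92, overshoot 1.05, gain 1, SNR 50
        print(round(giqe4_niirs(19.7, 0.92, 1.05, 1.0, 50.0), 2))  # ~5.2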

  18. Cross-layer design: a survey and the road ahead

    Microsoft Academic Search

    Vineet Srivastava; Mehul Motani

    2005-01-01

    Of late, there has been an avalanche of cross-layer design proposals for wireless networks. A number of researchers have looked at specific aspects of network performance and, approaching cross-layer design via their interpretation of what it implies, have presented several cross-layer design proposals. These proposals involve different layers of the protocol stack, and address both cellular and ad hoc networks.

  19. Surveying current research in object-oriented design

    Microsoft Academic Search

    Rebecca J. Wirfs-Brock; Ralph E. Johnson

    1990-01-01

    The state of object-oriented design is evolving rapidly. This survey describes what are currently thought to be the key ideas. Although it is necessarily incomplete, it contains both academic and industrial efforts and describes work in both the United States and Europe. It ignores well-known ideas, like those of Coad and Meyer [34], in favor of less widely known projects. Research in

  20. Probability of detection of nests and implications for survey design

    USGS Publications Warehouse

    Smith, P.A.; Bart, J.; Lanctot, Richard B.; McCaffery, B.J.; Brown, S.

    2009-01-01

    Surveys based on double sampling include a correction for the probability of detection by assuming complete enumeration of birds in an intensively surveyed subsample of plots. To evaluate this assumption, we calculated the probability of detecting active shorebird nests by using information from observers who searched the same plots independently. Our results demonstrate that this probability varies substantially by species and stage of the nesting cycle but less by site or density of nests. Among the species we studied, the estimated single-visit probability of nest detection during the incubation period varied from 0.21 for the White-rumped Sandpiper (Calidris fuscicollis), the most difficult species to detect, to 0.64 for the Western Sandpiper (Calidris mauri), the most easily detected species, with a mean across species of 0.46. We used these detection probabilities to predict the fraction of persistent nests found over repeated nest searches. For a species with the mean value for detectability, the detection rate exceeded 0.85 after four visits. This level of nest detection was exceeded in only three visits for the Western Sandpiper, but six to nine visits were required for the White-rumped Sandpiper, depending on the type of survey employed. Our results suggest that the double-sampling method's requirement of nearly complete counts of birds in the intensively surveyed plots is likely to be met for birds with nests that survive over several visits of nest searching. Individuals with nests that fail quickly or individuals that do not breed can be detected with high probability only if territorial behavior is used to identify likely nesting pairs. ?? The Cooper Ornithological Society, 2009.
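
    The link between the single-visit probabilities and the multi-visit detection rates quoted above is the complement rule for independent searches (assuming a constant per-visit probability p for a nest that persists through all visits):

        P_n = 1 - (1 - p)^n , \qquad P_4 = 1 - (1 - 0.46)^4 \approx 0.92 ,

    which is consistent with the 0.85 threshold being crossed by the fourth visit for a species of average detectability, and with the White-rumped Sandpiper (p = 0.21) needing roughly eight visits to reach the same level.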

  1. Surveys and questionnaires in nursing research.

    PubMed

    Timmins, Fiona

    2015-06-17

    Surveys and questionnaires are often used in nursing research to elicit the views of large groups of people to develop the nursing knowledge base. This article provides an overview of survey and questionnaire use in nursing research, clarifies the place of the questionnaire as a data collection tool in quantitative research design and provides information and advice about best practice in the development of quantitative surveys and questionnaires. PMID:26080989

  2. Using GIS to generate spatially balanced random survey designs for natural resource applications.

    PubMed

    Theobald, David M; Stevens, Don L; White, Denis; Urquhart, N Scott; Olsen, Anthony R; Norman, John B

    2007-07-01

    Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sample apply to the entire population. Probability survey designs should be used in natural resource and environmental management situations because they provide the mathematical foundation for statistical inference. Development of long-term monitoring designs demand survey designs that achieve statistical rigor and are efficient but remain flexible to inevitable logistical or practical constraints during field data collection. Here we describe an approach to probability-based survey design, called the Reversed Randomized Quadrant-Recursive Raster, based on the concept of spatially balanced sampling and implemented in a geographic information system. This provides environmental managers a practical tool to generate flexible and efficient survey designs for natural resource applications. Factors commonly used to modify sampling intensity, such as categories, gradients, or accessibility, can be readily incorporated into the spatially balanced sample design. PMID:17546523
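
    As a toy illustration of the hierarchical randomization idea behind quadrant-recursive spatially balanced designs (this is not the authors' RRQRR implementation, which operates on a raster within a GIS and supports stratification and unequal inclusion probabilities), one can order candidate points by a randomized quadrant-recursive address and then take a systematic sample along that order:

        import random

        def quadrant_perm(seed, prefix):
            # One fixed permutation of the four sub-quadrants per quadtree node, derived
            # deterministically from the node's address so every point in that node sees
            # the same relabelling (hierarchical randomization).
            return random.Random(f"{seed}:{prefix}").sample(range(4), 4)

        def rrqrr_address(x, y, levels, seed):
            """Map a point in the unit square to a hierarchically randomized
            quadrant-recursive address, sortable as a 1-D order."""
            addr = ""
            for _ in range(levels):
                qx, qy = int(x >= 0.5), int(y >= 0.5)
                quadrant = 2 * qy + qx
                addr += str(quadrant_perm(seed, addr)[quadrant])
                x, y = 2 * x - qx, 2 * y - qy     # zoom into the chosen quadrant
            return addr

        def spatially_balanced_sample(points, n, levels=8, seed=7):
            rng = random.Random(seed)
            ordered = sorted(points, key=lambda p: rrqrr_address(p[0], p[1], levels, seed))
            step = len(ordered) / n
            start = rng.uniform(0, step)
            return [ordered[int(start + i * step)] for i in range(n)]

        # Hypothetical candidate sites in the unit square
        pts = [(random.random(), random.random()) for _ in range(1000)]
        print(spatially_balanced_sample(pts, 10))

    Because the quadrant labels are re-randomized at every level of the recursion, the systematic sample along the resulting 1-D order spreads evenly in space while every candidate retains a known inclusion probability, which is the property that makes such designs suitable for probability-based inference.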

  3. On standard and optimal designs of industrial-scale 2-D seismic surveys

    NASA Astrophysics Data System (ADS)

    Guest, T.; Curtis, A.

    2011-08-01

    The principal aim of performing a survey or experiment is to maximize the desired information within a data set by minimizing the post-survey uncertainty on the ranges of the model parameter values. Using Bayesian, non-linear, statistical experimental design (SED) methods we show how industrial-scale amplitude variation with offset (AVO) surveys can be constructed to maximize the information content contained in AVO crossplots, the principal source of petrophysical information from seismic surveys. The design method allows offset-dependent errors, previously not allowed in non-linear geoscientific SED methods. The method is applied to a single common-midpoint gather. The results show that the optimal design is highly dependent on the ranges of the model parameter values when a low number of receivers is being used, but that a single optimal design exists for the complete range of parameters once the number of receivers is increased above a threshold value. However, when acquisition and processing costs are considered we find that a survey design with constant spatial receiver separation becomes close to optimal. This explains why regularly-spaced, 2-D seismic surveys have performed so well historically, not only from the point of view of noise attenuation and imaging in which homogeneous data coverage confers distinct advantages, but also to provide data to constrain subsurface petrophysical information.

  4. A SURVEY ON OPTIMAL DESIGN OF CIVIL ENGINEERING STRUCTURAL SYSTEMS

    Microsoft Academic Search

    C. S. KRISHNAMOORTHY; D. R. MOSI

    1979-01-01

    This paper attempts to synthesize the works carried out in the field of optimization in civil engineering structural design and is intended to help practising engineers and researchers. The paper is broadly classified according to the various types of structural systems and suitable grouping is done within this framework. Promising optimization methods, approximation concepts and structural design strategies are identified

  5. The Effect of Questionnaire Cover Design in Mail Surveys

    Microsoft Academic Search

    Philip Gendall

    It has been suggested that the response rate for a self administered questionnaire will be enhanced if the cover of the questionnaire contains a picture, and, furthermore, that the more distinctive and complex the cover design created, the stronger this effect is likely to be. This paper reports the results of a study designed to test these hypotheses by comparing

  6. First International Workshop on Quantitative Stochastic Models in the Verification and Design of Software Systems (QUOVADIS 2010)

    Microsoft Academic Search

    Carlo Ghezzi; Lars Grunske; Raffaela Mirandola

    2010-01-01

    Nowadays requirements related to quality attributes such as performance, reliability, safety and security are often considered the most important requirements for software development projects. To reason about these quality attributes different stochastic models can be used. These models enable probabilistic verification as well as quantitative prediction at design time. On the other hand, these models could be also used to

  7. Study Quality in SLA: An Assessment of Designs, Analyses, and Reporting Practices in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2013-01-01

    This study assesses research and reporting practices in quantitative second language (L2) research. A sample of 606 primary studies, published from 1990 to 2010 in "Language Learning and Studies in Second Language Acquisition," was collected and coded for designs, statistical analyses, reporting practices, and outcomes (i.e., effect…

  8. Sample and design considerations in post-disaster mental health needs assessment tracking surveys

    PubMed Central

    Kessler, Ronald C.; Keane, Terence M.; Ursano, Robert J.; Mokdad, Ali; Zaslavsky, Alan M.

    2009-01-01

    Although needs assessment surveys are carried out after many large natural and man-made disasters, synthesis of findings across these surveys and disaster situations about patterns and correlates of need is hampered by inconsistencies in study designs and measures. Recognizing this problem, the US Substance Abuse and Mental Health Services Administration (SAMHSA) assembled a task force in 2004 to develop a model study design and interview schedule for use in post-disaster needs assessment surveys. The US National Institute of Mental Health subsequently approved a plan to establish a center to implement post-disaster mental health needs assessment surveys in the future using an integrated series of measures and designs of the sort proposed by the SAMHSA task force. A wide range of measurement, design, and analysis issues will arise in developing this center. Given that the least widely discussed of these issues concerns study design, the current report focuses on the most important sampling and design issues proposed for this center based on our experiences with the SAMHSA task force, subsequent Katrina surveys, and earlier work in other disaster situations. PMID:19035440

  9. Electronic Survey Methodology: A Case Study in Reaching Hard-to-Involve Internet Users

    Microsoft Academic Search

    Dorine Andrews; Blair Nonnecke; Jennifer Preece

    2003-01-01

    Using the Internet to conduct quantitative research presents challenges not found in conventional research. Paper-based survey quality criteria cannot be completely adapted to electronic formats. Electronic surveys have distinctive technological, demographic, and response characteristics that affect their design, use, and implementation. Survey design, participant privacy and confidentiality, sampling and subject solicitation, distribution methods and response rates, and survey

  10. A Survey of Hardware Accelerators Used in Computer-Aided Design

    Microsoft Academic Search

    Tom Blank

    1984-01-01

    Hardware accelerators, or special-purpose engines, have been used in computer-aided design applications for nearly 20 years. In this time, roughly 20 machines have been built and tested specifically for such purposes as simulation, design rule checking, placement, and routing. Their uses are increasing, and the machines are becoming commercially available. This survey describes not only the machines but also their

  11. Metamodels for Computer-based Engineering Design: Survey and recommendations

    Microsoft Academic Search

    Timothy W. Simpson; J. D. Poplinski; P. N. Koch; J. K. Allen

    2001-01-01

    Abstract. The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper, we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning and kriging.

  12. The US National Comorbidity Survey Replication (NCS-R): design and field procedures.

    PubMed

    Kessler, Ronald C; Berglund, Patricia; Chiu, Wai Tat; Demler, Olga; Heeringa, Steven; Hiripi, Eva; Jin, Robert; Pennell, Beth-Ellen; Walters, Ellen E; Zaslavsky, Alan; Zheng, Hui

    2004-01-01

    The National Comorbidity Survey Replication (NCS-R) is a survey of the prevalence and correlates of mental disorders in the US that was carried out between February 2001 and April 2003. Interviews were administered face-to-face in the homes of respondents, who were selected from a nationally representative multi-stage clustered area probability sample of households. A total of 9,282 interviews were completed in the main survey and an additional 554 short non-response interviews were completed with initial non-respondents. This paper describes the main features of the NCS-R design and field procedures, including information on fieldwork organization and procedures, sample design, weighting and considerations in the use of design-based versus model-based estimation. Empirical information is presented on non-response bias, design effect, and the trade-off between bias and efficiency in minimizing total mean-squared error of estimates by trimming weights. PMID:15297905

  13. Effects of sample survey design on the accuracy of classification tree models in species distribution models

    USGS Publications Warehouse

    Edwards, T.C., Jr.; Cutler, D.R.; Zimmermann, N.E.; Geiser, L.; Moisen, G.G.

    2006-01-01

    We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by resubstitution rates were similar for each lichen species irrespective of the underlying sample survey form. Cross-validation estimates of prediction accuracies were lower than resubstitution accuracies for all species and both design types, and in all cases were closer to the true prediction accuracies based on the EVALUATION data set. We argue that greater emphasis should be placed on calculating and reporting cross-validation accuracy rates rather than simple resubstitution accuracy rates. Evaluation of the DESIGN and PURPOSIVE tree models on the EVALUATION data set shows significantly lower prediction accuracy for the PURPOSIVE tree models relative to the DESIGN models, indicating that non-probabilistic sample surveys may generate models with limited predictive capability. These differences were consistent across all four lichen species, with 11 of the 12 possible species and sample survey type comparisons having significantly lower accuracy rates. Some differences in accuracy were as large as 50%. The classification tree structures also differed considerably both among and within the modelled species, depending on the sample survey form. Overlap in the predictor variables selected by the DESIGN and PURPOSIVE tree models ranged from only 20% to 38%, indicating the classification trees fit the two evaluated survey forms on different sets of predictor variables. The magnitude of these differences in predictor variables throws doubt on ecological interpretation derived from prediction models based on non-probabilistic sample surveys. © 2006 Elsevier B.V. All rights reserved.
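    The gap the authors report between resubstitution and cross-validation accuracy is generic to classification trees and easy to demonstrate in general terms. The sketch below is illustrative only (synthetic data generated with scikit-learn, not the lichen records or the authors' tree settings): resubstitution scores the tree on the data it was grown from, while k-fold cross-validation holds data out and typically returns a lower, more honest estimate of prediction accuracy.

```python
# Resubstitution vs. cross-validated accuracy for a classification tree.
# Synthetic data stands in for presence/absence records.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, n_features=10, n_informative=4,
                           random_state=0)

tree = DecisionTreeClassifier(random_state=0)
tree.fit(X, y)

resub = tree.score(X, y)                        # accuracy on the training data
cv = cross_val_score(tree, X, y, cv=10).mean()  # 10-fold cross-validation

print(f"resubstitution accuracy: {resub:.2f}")  # typically near 1.0 for a full tree
print(f"cross-validated accuracy: {cv:.2f}")    # usually noticeably lower
```

    The same pattern motivates the authors' recommendation to report cross-validated rather than resubstitution accuracy rates.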

  14. Systematic assessment of survey scan and MS2-based abundance strategies for label-free quantitative proteomics using high-resolution MS data.

    PubMed

    Tu, Chengjian; Li, Jun; Sheng, Quanhu; Zhang, Ming; Qu, Jun

    2014-04-01

    Although survey-scan-based label-free methods have shown no compelling benefit over fragment ion (MS2)-based approaches when low-resolution mass spectrometry (MS) was used, the growing prevalence of high-resolution analyzers may have changed the game. This necessitates an updated, comparative investigation of these approaches for data acquired by high-resolution MS. Here, we compared survey scan-based (ion current, IC) and MS2-based abundance features including spectral-count (SpC) and MS2 total-ion-current (MS2-TIC), for quantitative analysis using various high-resolution LC/MS data sets. Key discoveries include: (i) a study with seven different biological data sets revealed that only IC achieved high reproducibility for lower-abundance proteins; (ii) evaluation with 5-replicate analyses of a yeast sample showed IC provided much higher quantitative precision and lower missing data; (iii) IC, SpC, and MS2-TIC all showed good quantitative linearity (R² > 0.99) over a >1000-fold concentration range; (iv) both MS2-TIC and IC showed good linear response to various protein loading amounts but not SpC; (v) quantification using a well-characterized CPTAC data set showed that IC exhibited markedly higher quantitative accuracy, higher sensitivity, and lower false-positives/false-negatives than both SpC and MS2-TIC. Therefore, IC achieved overall superior performance to the MS2-based strategies in terms of reproducibility, missing data, quantitative dynamic range, quantitative accuracy, and biomarker discovery. PMID:24635752

  15. Systematic Assessment of Survey Scan and MS2-Based Abundance Strategies for Label-Free Quantitative Proteomics Using High-Resolution MS Data

    PubMed Central

    2015-01-01

    Although survey-scan-based label-free methods have shown no compelling benefit over fragment ion (MS2)-based approaches when low-resolution mass spectrometry (MS) was used, the growing prevalence of high-resolution analyzers may have changed the game. This necessitates an updated, comparative investigation of these approaches for data acquired by high-resolution MS. Here, we compared survey scan-based (ion current, IC) and MS2-based abundance features including spectral-count (SpC) and MS2 total-ion-current (MS2-TIC), for quantitative analysis using various high-resolution LC/MS data sets. Key discoveries include: (i) a study with seven different biological data sets revealed that only IC achieved high reproducibility for lower-abundance proteins; (ii) evaluation with 5-replicate analyses of a yeast sample showed IC provided much higher quantitative precision and lower missing data; (iii) IC, SpC, and MS2-TIC all showed good quantitative linearity (R² > 0.99) over a >1000-fold concentration range; (iv) both MS2-TIC and IC showed good linear response to various protein loading amounts but not SpC; (v) quantification using a well-characterized CPTAC data set showed that IC exhibited markedly higher quantitative accuracy, higher sensitivity, and lower false-positives/false-negatives than both SpC and MS2-TIC. Therefore, IC achieved overall superior performance to the MS2-based strategies in terms of reproducibility, missing data, quantitative dynamic range, quantitative accuracy, and biomarker discovery. PMID:24635752

  16. Design study of the deepsky ultraviolet survey telescope. [Spacelab payload

    NASA Technical Reports Server (NTRS)

    Page, N. A.; Callaghan, F. G.; Killen, R. H.; Willis, W.

    1977-01-01

    Preliminary mechanical design and specifications are presented for a wide field ultraviolet telescope and detector to be carried as a Spacelab payload. Topics discussed include support structure stiffness (torsional and bending), mirror assembly, thermal control, optical alignment, attachment to the instrument pointing pallet, control and display, power requirements, acceptance and qualification test plans, cost analysis and scheduling. Drawings are included.

  17. Designing Your Sample Efficiently: Clustering Effects in Education Surveys

    ERIC Educational Resources Information Center

    Hutchison, Dougal

    2009-01-01

    Background: Education, and information about education, is highly structured: individuals are grouped into classes, which are grouped into schools, which are grouped into local authorities, which are grouped into countries. The degree of similarity among members of a group, such as a school or classroom, is a very important factor in the design

  18. A survey of design methods for failure detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1975-01-01

    A number of methods for the detection of abrupt changes (such as failures) in stochastic dynamical systems were surveyed. The class of linear systems was emphasized, but the basic concepts, if not the detailed analyses, carry over to other classes of systems. The methods surveyed range from the design of specific failure-sensitive filters, to the use of statistical tests on filter innovations, to the development of jump process formulations. Tradeoffs in complexity versus performance are discussed.

  19. HRMS sky survey wideband feed system design for DSS 24 beam waveguide antenna

    NASA Technical Reports Server (NTRS)

    Stanton, P. H.; Lee, P. R.; Reilly, H. F.

    1993-01-01

    The High-Resolution Microwave Survey (HRMS) Sky Survey project will be implemented on the DSS 24 beam waveguide (BWG) antenna over the frequency range of 2.86 to 10 GHz. Two wideband, ring-loaded, corrugated feed horns were designed to cover this range. The horns match the frequency-dependent gain requirements for the DSS 24 BWG system. The performance of the feed horns and the calculated system performance of DSS 24 are presented.

  20. The Visible and Infrared Survey Telescope for Astronomy (VISTA): Design, technical overview, and performance

    NASA Astrophysics Data System (ADS)

    Sutherland, Will; Emerson, Jim; Dalton, Gavin; Atad-Ettedgui, Eli; Beard, Steven; Bennett, Richard; Bezawada, Naidu; Born, Andrew; Caldwell, Martin; Clark, Paul; Craig, Simon; Henry, David; Jeffers, Paul; Little, Bryan; McPherson, Alistair; Murray, John; Stewart, Malcolm; Stobie, Brian; Terrett, David; Ward, Kim; Whalley, Martin; Woodhouse, Guy

    2015-03-01

    The Visible and Infrared Survey Telescope for Astronomy (VISTA) is the 4-m wide-field survey telescope at ESO's Paranal Observatory, equipped with the world's largest near-infrared imaging camera (VISTA IR Camera, VIRCAM), with 1.65 degree diameter field of view, and 67 Mpixels giving 0.6 deg² active pixel area, operating at wavelengths 0.8-2.3 μm. We provide a short history of the project, and an overview of the technical details of the full system including the optical design, mirrors, telescope structure, IR camera, active optics, enclosure and software. The system includes several innovative design features such as the f/1 primary mirror, the dichroic cold-baffle camera design and the sophisticated wavefront sensing system delivering closed-loop 5-axis alignment of the secondary mirror. We conclude with a summary of the delivered performance, and a short overview of the six ESO public surveys in progress on VISTA.

  1. Improved optical design for the Large Synoptic Survey Telescope (LSST)

    Microsoft Academic Search

    Lynn G. Seppala

    This paper presents an improved optical design for the LSST, an f/1.25 three-mirror telescope covering 3.0 degrees full field angle, with 6.9 m effective aperture diameter. The telescope operates at five wavelength bands spanning 386.5 nm to 1040 nm (B, V, R, I and Z). For all bands, 80% of the polychromatic diffracted energy is collected within 0.20 arcseconds

  2. Improved Optical Design for the Large Synoptic Survey Telescope (LSST)

    SciTech Connect

    Seppala, L

    2002-09-24

    This paper presents an improved optical design for the LSST, an f/1.25 three-mirror telescope covering 3.0 degrees full field angle, with 6.9 m effective aperture diameter. The telescope operates at five wavelength bands spanning 386.5 nm to 1040 nm (B, V, R, I and Z). For all bands, 80% of the polychromatic diffracted energy is collected within 0.20 arc-seconds diameter. The reflective telescope uses an 8.4 m f/1.06 concave primary, a 3.4 m convex secondary and a 5.2 m concave tertiary in a Paul geometry. The system length is 9.2 m. A refractive corrector near the detector uses three fused silica lenses, rather than the two lenses of previous designs. Earlier designs required that one element be a vacuum barrier, but now the detector sits in an inert gas at ambient pressure. The last lens is the gas barrier. Small adjustments lead to optimal correction at each band. The filters have different axial thicknesses. The primary and tertiary mirrors are repositioned for each wavelength band. The new optical design incorporates features to simplify manufacturing. They include a flat detector, a far less aspheric convex secondary (10 μm from best fit sphere) and reduced aspheric departures on the lenses and tertiary mirror. Five aspheric surfaces, on all three mirrors and on two lenses, are used. The primary is nearly parabolic. The telescope is fully baffled so that no specularly reflected light from any field angle, inside or outside of the full field angle of 3.0 degrees, can reach the detector.

  3. Pre-PCR DNA quantitation of soil and sediment samples: method development and instrument design

    Microsoft Academic Search

    P. C. Stark; K. I. Mullen; K. Banton; R. Russotti; D. Soran; C. R. Kuske

    2000-01-01

    A simple and straightforward method for the quantitation of dsDNA in soil and sediment matrices has been developed to support rapid, in-the-field PCR analysis of environmental samples. This method uses PicoGreen nucleic acid stain, and a combination of UV/Vis and fluorescence spectroscopy, to quantitate dsDNA in the presence of interfering humic materials. The practical utility of this approach is that

  4. Sampling design for the 1980 commercial and multifamily residential building survey

    SciTech Connect

    Bowen, W.M.; Olsen, A.R.; Nieves, A.L.

    1981-06-01

    Details of a proposed sample design for the 1980 Commercial and Multifamily Building Energy Performance Survey are presented. The objective of the survey is to assess the extent to which new building design practices comply with the proposed 1980 Energy Budget Levels for Commercial and Multifamily Residential Building Designs (DEB80). The procedure will be to: identify a small number of building types which account for the majority of commercial buildings constructed in the U.S.A.; conduct a separate survey for each building type; and include only buildings designed during 1980. For each building in the survey, the Design Energy Consumption (DEC80) will be determined by the DOE-2.1 computer program. The quantity X = (DEC80 - DEB80) will be calculated for each building as a measure of its compliance with DEB80. These X quantities will then be used to compute sample statistics. Inferences about nationwide compliance with DEB80 may then be made for each building type. This report provides details of the population, sampling frame, stratification, sample size, and implementation of the sampling plan.
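    The compliance measure itself is simple arithmetic once the simulation output is in hand: X = DEC80 - DEB80 for each sampled building, with negative values indicating designs that beat the budget, and sample statistics of X summarizing compliance for a building type. A small hypothetical sketch of that last step follows; the consumption and budget numbers are invented placeholders, not survey data, and the real analysis would also carry the stratified design weights described in the report.

```python
# Sample statistics of the compliance measure X = DEC80 - DEB80 for one
# building type. Negative X means the design beats the energy budget.
import statistics

# Hypothetical (design energy consumption, design energy budget) pairs,
# e.g. in kBtu per square foot per year.
buildings = [(52.0, 55.0), (61.5, 55.0), (48.3, 55.0), (57.2, 55.0), (50.9, 55.0)]

x = [dec - deb for dec, deb in buildings]
mean_x = statistics.mean(x)
se_x = statistics.stdev(x) / len(x) ** 0.5
share_complying = sum(xi <= 0 for xi in x) / len(x)

print(f"mean X = {mean_x:.2f}, SE = {se_x:.2f}, fraction complying = {share_complying:.2f}")
```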

  5. Optical Design Trade Study for the Wide Field Infrared Survey Telescope [WFIRST

    NASA Technical Reports Server (NTRS)

    Content, David A.; Goullioud, R.; Lehan, John P.; Mentzell, John E.

    2011-01-01

    The Wide Field Infrared Survey Telescope (WFIRST) mission concept was ranked first among new space astrophysics missions by the Astro2010 Decadal Survey, incorporating the Joint Dark Energy Mission (JDEM)-Omega payload concept and multiple science white papers. This mission is based on a space telescope at L2 studying exoplanets [via gravitational microlensing], probing dark energy, and surveying the near infrared sky. Since the release of NWNH, the WFIRST project has been working with the WFIRST science definition team (SDT) to refine mission and payload concepts. We present the driving requirements. The current interim reference mission point design, based on the use of a 1.3m unobscured aperture three mirror anastigmat form, with focal imaging and slitless spectroscopy science channels, is consistent with the requirements, requires no technology development, and outperforms the JDEM-Omega design.

  6. Beyond surveys and focus groups: usability testing can help you hone your Web site design.

    PubMed

    2004-07-01

    Usability testing goes beyond surveys, Web trend analysis, and focus groups by providing direct observation of how people interact with a site. The process can be informal and inexpensive, involving a few participants and a Web designer. Or it can be more formal, using special technology and professional usability analysts. But even a relatively casual usability test can give Web designers information about the site that they can't get in any other way. PMID:15346969

  7. Wide Field Infrared Survey Telescope [WFIRST]: Telescope Design and Simulated Performance

    NASA Technical Reports Server (NTRS)

    Goullioud, R.; Content, D. A.; Kuan, G. M.; Moore, J. D.; Chang, Z.; Sunada, E. T.; Villalvazo, J.; Hawk, J. P.; Armani, N. V.; Johnson, E. L.; Powell, C. A.

    2012-01-01

    The ASTRO2010 Decadal Survey proposed multiple missions with NIR focal planes and 3-mirror wide-field telescopes in the 1.5 m aperture range. None of them would have won as standalone missions; WFIRST is a combination of these missions, created by the Astro2010 committee. The WFIRST Science Definition Team (SDT) was tasked to examine the design. The project team is a GSFC-JPL-Caltech collaboration. This interim mission design is the result of combined work by the project team with the SDT.

  8. Optimization of electrical geophysical survey design for hydrogeological applications and subsurface target discrimination

    NASA Astrophysics Data System (ADS)

    Goode, Tomas Charles

    Geophysical imaging methods significantly enhance our knowledge of subsurface characteristics and their use has become prevalent over a range of subsurface investigations. These methods facilitate the detection and characterization of both metallic and nonmetallic subsurface targets, and can provide spatially extensive information on subsurface structure and characteristics that is often impractical to obtain using standard drilling and sampling procedures alone. Electrical imaging methods such as electrical resistivity tomography (ERT) have proven to be particularly useful in hydrogeologic and geotechnical investigations because of the strong dependence of the electrical properties of soils on water saturation, soil texture, and solute concentration. Given the available geophysical tools as well as their applications, the selection of the appropriate geophysical survey design is an essential part of every subsurface geophysical investigation. Where investigations are located in an area with subsurface information already available, this information may be used as a guide for the design of a geophysical survey. In some instances, no subsurface information is available and a survey must be designed to cover a range of possible circumstances. Yet, in other instances, there may be significant subsurface information available, but because of subsurface complexities, a geophysical survey must still be designed to cover a broad range of possibilities. Demonstrating the application and limitations of ERT in a specific field application, the first investigation presented in this document provides guidance for developing methods to improve the design and implementation of ERT surveys in a complex subsurface environment. The two investigations that follow present the development of a relatively simple optimization approach based on limited forward modeling of the geophysical response for both static and mobile surveys. This process is demonstrated through examples of selecting a limited number of ERT surveys to identify and discriminate subsurface target tunnels (with a simple cylindrical geometry). These examples provide insights into the practical application of the optimization process for improved ERT survey design for subsurface target detection. Because of their relative simplicity, the optimization procedures developed here may be used to rapidly identify optimal array configurations without the need for computationally expensive inversion techniques.

  9. Design and synthesis of target-responsive aptamer-cross-linked hydrogel for visual quantitative detection of ochratoxin A.

    PubMed

    Liu, Rudi; Huang, Yishun; Ma, Yanli; Jia, Shasha; Gao, Mingxuan; Li, Jiuxing; Zhang, Huimin; Xu, Dunming; Wu, Min; Chen, Yan; Zhu, Zhi; Yang, Chaoyong

    2015-04-01

    A target-responsive aptamer-cross-linked hydrogel was designed and synthesized for portable and visual quantitative detection of the toxin Ochratoxin A (OTA), which occurs in food and beverages. The hydrogel network forms by hybridization between one designed DNA strand containing the OTA aptamer and two complementary DNA strands grafting on linear polyacrylamide chains. Upon the introduction of OTA, the aptamer binds with OTA, leading to the dissociation of the hydrogel, followed by release of the preloaded gold nanoparticles (AuNPs), which can be observed by the naked eye. To enable sensitive visual and quantitative detection, we encapsulated Au@Pt core-shell nanoparticles (Au@PtNPs) in the hydrogel to generate quantitative readout in a volumetric bar-chart chip (V-Chip). In the V-Chip, Au@PtNPs catalyzes the oxidation of H2O2 to generate O2, which induces movement of an ink bar to a concentration-dependent distance for visual quantitative readout. Furthermore, to improve the detection limit in complex real samples, we introduced an immunoaffinity column (IAC) of OTA to enrich OTA from beer. After the enrichment, as low as 1.27 nM (0.51 ppb) OTA can be detected by the V-Chip, which satisfies the test requirement (2.0 ppb) by the European Commission. The integration of a target-responsive hydrogel with portable enrichment by IAC, as well as signal amplification and quantitative readout by a simple microfluidic device, offers a new method for portable detection of food safety hazard toxin OTA. PMID:25771715

  10. "Is This Ethical?" A Survey of Opinion on Principles and Practices of Document Design.

    ERIC Educational Resources Information Center

    Dragga, Sam

    1996-01-01

    Reprints a corrected version of an article originally published in the volume 43, number 1 issue of this journal. Presents results of a national survey of technical communicators and technical communication teachers assessing the ethics of seven document design cases involving manipulation of typography, illustrations, and photographs. Offers…

  11. RESEARCH VESSEL SURVEY DESIGN FOR MONITORING DOLPHIN ABUNDANCE IN THE EASTERN TROPICAL PACIFIC

    E-print Network

    RESEARCH VESSEL SURVEY DESIGN FOR MONITORING DOLPHIN ABUNDANCE IN THE EASTERN TROPICAL PACIFIC Service began conducting long-term research ship surveys to determine status of spotted dolphin, Stenella attenuata, stocks in the eastern tropical Pacific. This is the main dolphin species taken incidentally

  12. Optimal Survey design using the point spread function measure of resolution Partha S. Routh

    E-print Network

    Oldenburg, Douglas W.

    with a given geometry and obtain a model that we denote as the primal inverse problem. For a linear problem, we pose survey design as an inverse problem by maximizing a resolution measure in a region of interest so that the point spread function is as delta-like as possible. This problem is solved as a nonlinear optimization problem with constraints

  13. Methodological Uses of TUS to Inform Design and Analysis of Tobacco Control Surveys

    Cancer.gov

    Methodological Uses of TUS to Inform Design and Analysis of Tobacco Control Surveys. Cristine Delnevo, PhD, MPH, UMDNJ-School of Public Health. Why is methods research in Tobacco Surveillance important? Measuring individual behavior over time is crucial

  14. ASSESSING THE ECOLOGICAL CONDITION OF A COASTAL PLAIN WATERSHED USING A PROBABILISTIC SURVEY DESIGN

    EPA Science Inventory

    Using a probabilistic survey design, we assessed the ecological condition of the Florida (USA) portion of the Escambia River watershed using selected environmental and benthic macroinvertebrate data. Macroinvertebrates were sampled at 28 sites during July-August 1996, and 3414 i...

  15. Designing questionnaires: healthcare survey to compare two different response scales

    PubMed Central

    2014-01-01

    Background: A widely discussed design issue in patient satisfaction questionnaires is the optimal length and labelling of the answering scale. The aim of the present study was to compare intra-individually the answers on two response scales to five general questions evaluating patients’ perception of hospital care. Methods: Between November 2011 and January 2012, all in-hospital patients at a Swiss University Hospital received a patient satisfaction questionnaire on an adjectival scale with three to four labelled categories (LS) and five redundant questions displayed on an 11-point end-anchored numeric scale (NS). The scales were compared concerning ceiling effect, internal consistency (Cronbach’s alpha), individual item answers (Spearman’s rank correlation), and concerning overall satisfaction by calculating an overall percentage score (sum of all answers related to the maximum possible sum). Results: The response rate was 41% (2957/7158), of which 2400 (81%) completely filled out all questions. Baseline characteristics of the responders and non-responders were similar. Floor and ceiling effects were high on both response scales, but more pronounced on the LS than on the NS. Cronbach’s alpha was higher on the NS than on the LS. There was a strong individual item correlation between both answering scales in questions regarding the intent to return, quality of treatment and the judgement whether the patient was treated with respect and dignity, but a lower correlation concerning satisfactory information transfer by physicians or nurses, where only three categories were available in the LS. The overall percentage score showed a comparable distribution, but with a wider spread of lower satisfaction in the NS. Conclusions: Since the longer scale did not substantially reduce the ceiling effect, the type of questions rather than the type of answering scale could be addressed with a focus on specific questions about concrete situations instead of general questions. Moreover, the low correlation in questions about information provision suggests that only three possible response choices are insufficient. Further investigations are needed to find a more sensitive scale discriminating high-end ratings. Otherwise, a longitudinal within-hospital or a cross-sectional between-hospital comparison of patient care is questionable. PMID:25086869

  16. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys

    PubMed Central

    Hund, Lauren; Bedrick, Edward J.; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis. PMID:26125967
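    For readers unfamiliar with the underlying decision rules: in the simplest (non-clustered) LQAS setting the rule is a binomial threshold, accept the lot if at least d of n sampled individuals have the outcome, with n and d chosen so that the risk of failing a good lot and the risk of passing a bad lot are both controlled. The sketch below illustrates that standard construction with commonly used illustrative values (80%/50% coverage thresholds, 10% risks); it is not one of the cluster LQAS designs compared in the paper, which adjust this logic for clustering.

```python
# Simple (non-clustered) LQAS design: find the smallest decision rule d for a
# given sample size n such that both classification risks are controlled.
from scipy.stats import binom

def lqas_risks(n, d, p_upper, p_lower):
    """Risk of failing a good lot (coverage p_upper) and passing a bad lot
    (coverage p_lower), when the rule is: accept if at least d of n have the outcome."""
    alpha = binom.cdf(d - 1, n, p_upper)       # P(fewer than d successes | good lot)
    beta = 1.0 - binom.cdf(d - 1, n, p_lower)  # P(at least d successes | bad lot)
    return alpha, beta

n, p_upper, p_lower = 19, 0.80, 0.50  # illustrative values, not from the paper
for d in range(1, n + 1):
    alpha, beta = lqas_risks(n, d, p_upper, p_lower)
    if alpha <= 0.10 and beta <= 0.10:
        print(f"n={n}, decision rule d={d}: alpha={alpha:.3f}, beta={beta:.3f}")
        break
```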

  17. Designers' experiences of design methods and tools

    Microsoft Academic Search

    M. Lindahl

    2004-01-01

    This paper is based on an Internet-based questionnaire survey with both qualitative and quantitative questions, selected as the research method in order to collect data about designers' experience of the design methods and tools they use. The result is that general formal follow-up and reflection on the design methods and tools used is experienced as low, and this implies

  18. Biological inventory for conservation evaluation I. Design of a field survey for diurnal, terrestrial birds in southern Australia

    Microsoft Academic Search

    H. M. Neave; T. W. Norton; H. A. Nix

    1996-01-01

    A systematic, stratified, regional biological survey was designed and implemented to sample the variation in bird and vegetation assemblages in an area of some 758 129 hectares of open Eucalyptus forest in south east Australia. The survey design was based on spatial estimates of abiotic attributes (terrain, climate and nutrient-supply potential of the substrate) and vegetation cover for a 9

  19. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  20. Design and field procedures in the US National Comorbidity Survey Replication Adolescent Supplement (NCS-A).

    PubMed

    Kessler, Ronald C; Avenevoli, Shelli; Costello, E Jane; Green, Jennifer Greif; Gruber, Michael J; Heeringa, Steven; Merikangas, Kathleen R; Pennell, Beth-Ellen; Sampson, Nancy A; Zaslavsky, Alan M

    2009-06-01

    An overview is presented of the design and field procedures of the US National Comorbidity Survey Replication Adolescent Supplement (NCS-A), a US face-to-face household survey of the prevalence and correlates of DSM-IV mental disorders. The survey was based on a dual-frame design that included 904 adolescent residents of the households that participated in the US National Comorbidity Survey Replication (85.9% response rate) and 9244 adolescent students selected from a nationally representative sample of 320 schools (74.7% response rate). After expositing the logic of dual-frame designs, comparisons are presented of sample and population distributions on Census socio-demographic variables and, in the school sample, school characteristics. These document only minor differences between the samples and the population. The results of statistical analysis of the bias-efficiency trade-off in weight trimming are then presented. These show that modest trimming meaningfully reduces mean squared error. Analysis of comparative sample efficiency shows that the household sample is more efficient than the school sample, leading to the household sample getting a higher weight relative to its size in the consolidated sample relative to the school sample. Taken together, these results show that the NCS-A is an efficient sample of the target population with good representativeness on a range of socio-demographic and geographic variables. PMID:19507169

  1. Estimation of wildlife population ratios incorporating survey design and visibility bias

    USGS Publications Warehouse

    Samuel, M.D.; Steinhorst, R.K.; Garton, E.O.; Unsworth, J.W.

    1992-01-01

    Age and sex ratio statistics are often a key component of the evaluation and management of wildlife populations. These statistics are determined from counts of animals that are commonly plagued by errors associated with either survey design or visibility bias. We present age and sex ratio estimators that incorporate both these sources of error and include the typical situation that animals are sampled in groups. Aerial surveys of elk (Cervus elaphus) in northcentral Idaho illustrate that differential visibility of age or sex classes can produce biased ratio estimates. Visibility models may be used to provide corrected estimates of ratios and their variability that incorporates errors due to sampling, visibility bias, and visibility estimation.
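    The core of a visibility-corrected ratio estimate can be illustrated very simply: expand each class's counts by the inverse of its class-specific detection probability before forming the ratio. The toy numbers below are invented, and the sketch ignores the group-level sampling and variance estimation that the paper develops, but it shows why differential visibility biases an uncorrected ratio.

```python
# Visibility-corrected age ratio (calves per 100 cows) from grouped aerial counts.
# Each tuple is (calves seen, cows seen) in one observed group; detection
# probabilities differ by class, so the raw ratio is biased.
groups = [(2, 5), (0, 3), (4, 9), (1, 6)]   # hypothetical counts
p_detect = {"calf": 0.6, "cow": 0.8}        # hypothetical visibility by class

raw_calves = sum(c for c, _ in groups)
raw_cows = sum(w for _, w in groups)

adj_calves = raw_calves / p_detect["calf"]  # expand counts by 1/p
adj_cows = raw_cows / p_detect["cow"]

print(f"naive ratio:     {100 * raw_calves / raw_cows:.1f} calves per 100 cows")
print(f"corrected ratio: {100 * adj_calves / adj_cows:.1f} calves per 100 cows")
```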

  2. Median and quantile tests under complex survey design using SAS and R.

    PubMed

    Pan, Yi; Caudill, Samuel P; Li, Ruosha; Caldwell, Kathleen L

    2014-11-01

    Techniques for conducting hypothesis testing on the median and other quantiles of two or more subgroups under complex survey design are limited. In this paper, we introduce programs in both SAS and R to perform such a test. A detailed illustration of the computations, macro variable definitions, input and output for the SAS and R programs are also included in the text. Urinary iodine data from National Health and Nutrition Examination Survey (NHANES) are used as examples for comparing medians between females and males as well as comparing the 75th percentiles among three salt consumption groups. PMID:25123100
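    The quantity being compared here is a design-based (weighted) quantile; the SAS and R programs described by the authors additionally provide variance estimation and hypothesis tests that respect the complex design. As a rough illustration only, here is a minimal weighted-quantile computation in Python with invented values and weights; it is not the authors' program and ignores strata and clusters.

```python
# Weighted quantile estimate: the smallest value whose cumulative share of
# survey weight reaches the requested quantile level.
import numpy as np

def weighted_quantile(values, weights, q):
    order = np.argsort(values)
    v, w = np.asarray(values)[order], np.asarray(weights)[order]
    cum = np.cumsum(w) / np.sum(w)
    return v[np.searchsorted(cum, q)]

# Hypothetical urinary iodine values (ug/L) and survey weights for two groups
females = ([120.0, 95.0, 210.0, 160.0, 75.0], [1.2, 0.8, 1.5, 1.0, 0.9])
males = ([140.0, 180.0, 110.0, 220.0, 130.0], [1.1, 1.3, 0.7, 1.4, 1.0])

for label, (vals, wts) in {"females": females, "males": males}.items():
    print(label, "weighted median:", weighted_quantile(vals, wts, 0.5))
```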

  3. Composite Interval Mapping Based on Lattice Design for Error Control May Increase Power of Quantitative Trait Locus Detection

    PubMed Central

    Huang, Zhongwen; Zhao, Tuanjie; Xing, Guangnan; Gai, Junyi; Guan, Rongzhan

    2015-01-01

    Experimental error control is very important in quantitative trait locus (QTL) mapping. Although numerous statistical methods have been developed for QTL mapping, a QTL detection model based on an appropriate experimental design that emphasizes error control has not been developed. Lattice design is very suitable for experiments with large sample sizes, which is usually required for accurate mapping of quantitative traits. However, the lack of a QTL mapping method based on lattice design dictates that the arithmetic mean or adjusted mean of each line of observations in the lattice design had to be used as a response variable, resulting in low QTL detection power. As an improvement, we developed a QTL mapping method termed composite interval mapping based on lattice design (CIMLD). In the lattice design, experimental errors are decomposed into random errors and block-within-replication errors. Four levels of block-within-replication errors were simulated to show the power of QTL detection under different error controls. The simulation results showed that the arithmetic mean method, which is equivalent to a method under random complete block design (RCBD), was very sensitive to the size of the block variance and with the increase of block variance, the power of QTL detection decreased from 51.3% to 9.4%. In contrast to the RCBD method, the power of CIMLD and the adjusted mean method did not change for different block variances. The CIMLD method showed 1.2- to 7.6-fold higher power of QTL detection than the arithmetic or adjusted mean methods. Our proposed method was applied to real soybean (Glycine max) data as an example and 10 QTLs for biomass were identified that explained 65.87% of the phenotypic variation, while only three and two QTLs were identified by arithmetic and adjusted mean methods, respectively. PMID:26076140

  4. Bringing policy into space systems conceptual design : qualitative and quantitative methods

    E-print Network

    Weigel, Annalisa L. (Annalisa Lynn), 1972-

    2002-01-01

    A change in government policy can send waves of crippling impacts through the design and development of publicly funded complex engineering systems. Thus it is important for system architects and designers to understand ...

  5. Screen Design Guidelines for Motivation in Interactive Multimedia Instruction: A Survey and Framework for Designers.

    ERIC Educational Resources Information Center

    Lee, Sung Heum; Boling, Elizabeth

    1999-01-01

    Identifies guidelines from the literature relating to screen design and design of interactive instructional materials. Describes two types of guidelines--those aimed at enhancing motivation and those aimed at preventing loss of motivation--for typography, graphics, color, and animation and audio. Proposes a framework for considering motivation in…

  6. The NOAO Deep Wide-Field Survey: Design and Initial Results

    NASA Astrophysics Data System (ADS)

    Jannuzi, B. T.; Dey, A.; Brown, M. J. I.; Tiede, G. P.; NDWFS Team

    2002-12-01

    The NOAO Deep Wide-Field Survey (NDWFS) is a very deep optical and IR (BWRIJHK) imaging survey of 18 square degrees of the sky with the primary goal of studying the evolution of large-scale structure from z ~ 1-4. The survey enables investigation of the formation and evolution of galaxies and the detection of luminous, very distant (z>4), star-forming galaxies and quasars. The images are also being used for weak-lensing studies and to provide information on the optical/IR counterparts to sources detected at other wavelengths. The extensive multi-wavelength observations targeting the NDWFS fields include observations with Chandra (x-rays), GALEX (UV), SIRTF (near, mid, and far IR), the VLA, and Westerbork (radio). I will review the design of the survey, the status of observations (nearing completion with 90% of the data obtained), and initial scientific results (e.g., evolution of clustering of red galaxies and EROs, see also contribution by M. Brown et al. at this meeting; IR properties of FIRST Survey detected radio galaxies in the NDWFS, see contribution by Henderson et al. this meeting). Our research is supported by the National Optical Astronomy Observatory which is operated by the Association of Universities for Research in Astronomy, Inc. (AURA) under cooperative agreement with the National Science Foundation.

  7. Survey and design of pictographic sign messages used in the hotel/motel industry 

    E-print Network

    Nichols, Kenneth Aaron

    1992-01-01

    Survey and Design of Pictographic Sign Messages Used in the Hotel/Motel Industry. (May 1992) Kenneth Aaron Nichols, BBA, Hardin-Simmons University. Chair of Advisory Committee: Dr. R. Dale Huchingson. Individual hotels and motels use different pictographs to convey the services and facilities they offer. There is no known data that evaluates the effectiveness of these pictographs. Several...

  8. Some New Three Level Designs for the Study of Quantitative Variables

    Microsoft Academic Search

    G. E. P. Box; D. W. Behnken

    1960-01-01

    A class of incomplete three level factorial designs useful for estimating the coefficients in a second degree graduating polynomial are described. The designs either meet, or approximately meet, the criterion of rotatability and for the most part can be orthogonally blocked. A fully worked example is included.

  9. Evaluation of a portable x-ray fluorescence survey meter for the quantitative determination of trace metals in welding fumes 

    E-print Network

    Fehrenbacher, Mary Catherine

    1984-01-01

    wet-ashing the membrane filter containing the fume followed by atomic absorption spectrophotometry. Although this method is very reliable, it is both time-consuming and destructive in nature. In contrast, x-ray fluorescence spectrometry is a very.... Previous research has shown x-ray fluorescence spectrometry to have the sensitivity necessary for industrial hygiene applications. A portable x-ray fluorescent survey meter was evaluated for analytical performance in a laboratory environment and as a...

  10. Loop Shaping Control Design for a Supersonic Propulsion System Model Using Quantitative Feedback Theory (QFT) Specifications and Bounds

    NASA Technical Reports Server (NTRS)

    Connolly, Joseph W.; Kopasakis, George

    2010-01-01

    This paper covers the propulsion system component modeling and controls development of an integrated mixed compression inlet and turbojet engine that will be used for an overall vehicle Aero-Propulso-Servo-Elastic (APSE) model. Using previously created nonlinear component-level propulsion system models, a linear integrated propulsion system model and loop shaping control design have been developed. The design includes both inlet normal shock position control and jet engine rotor speed control for a potential supersonic commercial transport. A preliminary investigation of the impacts of the aero-elastic effects on the incoming flow field to the propulsion system are discussed, however, the focus here is on developing a methodology for the propulsion controls design that prevents unstart in the inlet and minimizes the thrust oscillation experienced by the vehicle. Quantitative Feedback Theory (QFT) specifications and bounds, and aspects of classical loop shaping are used in the control design process. Model uncertainty is incorporated in the design to address possible error in the system identification mapping of the nonlinear component models into the integrated linear model.

  11. The Proteome of Human Liver Peroxisomes: Identification of Five New Peroxisomal Constituents by a Label-Free Quantitative Proteomics Survey

    PubMed Central

    Ofman, Rob; Bunse, Christian; Pawlas, Magdalena; Hayen, Heiko; Eisenacher, Martin; Stephan, Christian; Meyer, Helmut E.; Waterham, Hans R.; Erdmann, Ralf; Wanders, Ronald J.; Warscheid, Bettina

    2013-01-01

    The peroxisome is a key organelle of low abundance that fulfils various functions essential for human cell metabolism. Severe genetic diseases in humans are caused by defects in peroxisome biogenesis or deficiencies in the function of single peroxisomal proteins. To improve our knowledge of this important cellular structure, we studied for the first time human liver peroxisomes by quantitative proteomics. Peroxisomes were isolated by differential and Nycodenz density gradient centrifugation. A label-free quantitative study of 314 proteins across the density gradient was accomplished using high resolution mass spectrometry. By pairing statistical data evaluation, cDNA cloning and in vivo colocalization studies, we report the association of five new proteins with human liver peroxisomes. Among these, isochorismatase domain containing 1 protein points to the existence of a new metabolic pathway and hydroxysteroid dehydrogenase like 2 protein is likely involved in the transport or β-oxidation of fatty acids in human peroxisomes. The detection of alcohol dehydrogenase 1A suggests the presence of an alternative alcohol-oxidizing system in hepatic peroxisomes. In addition, lactate dehydrogenase A and malate dehydrogenase 1 partially associate with human liver peroxisomes and enzyme activity profiles support the idea that NAD+ becomes regenerated during fatty acid β-oxidation by alternative shuttling processes in human peroxisomes involving lactate dehydrogenase and/or malate dehydrogenase. Taken together, our data represent a valuable resource for future studies of peroxisome biochemistry that will advance research of human peroxisomes in health and disease. PMID:23460848

  12. Targeting spatiotemporal dynamics of planktonic SAGMGC-1 and segregation of ammonia-oxidizing thaumarchaeota ecotypes by newly designed primers and quantitative polymerase chain reaction.

    PubMed

    Restrepo-Ortiz, Claudia X; Auguet, Jean-Christophe; Casamayor, Emilio O

    2014-03-01

    The annual dynamics of three different ammonia-oxidizing archaea (AOA) ecotypes (amoA gene) and of the SAGMGC-1 (Nitrosotalea-like aquatic Thaumarchaeota) group (16S rRNA gene) were studied by newly designed specific primers and quantitative polymerase chain reaction analysis in a deep oligotrophic high mountain lake (Lake Redon, Limnological Observatory of the Pyrenees, Spain). We observed segregated distributions of the main AOA populations, peaking separately in time and space, and under different ammonia concentrations and irradiance conditions. Strong positive correlation in gene abundances was found along the annual survey between 16S rRNA SAGMAGC-1 and one of the amoA ecotypes suggesting the potential for ammonia oxidation in the freshwater SAGMAGC-1 clade. We also observed dominance of Nitrosotalea-like ecotypes over Nitrosopumilus-like (Marine Group 1.1a) and not the same annual dynamics for the two thaumarchaeotal clades. The fine scale segregation in space and time of the different AOA ecotypes indicated the presence of phylogenetically close but ecologically segregated AOA species specifically adapted to specific environmental conditions. It remains to be elucidated what would be such environmental drivers. PMID:23848190

  13. A quantitative methodology for mapping project costs to engineering decisions in naval ship design and procurement

    E-print Network

    Netemeyer, Kristopher David

    2010-01-01

    Alternative methods for cost estimation are important in the early conceptual stages of a design when there is not enough detail to allow for a traditional quantity takeoff estimate to be performed. Much of the budgeting ...

  14. Multiwavelength CO2 differential-absorption lidar (DIAL) system designed for quantitative concentration measurement

    Microsoft Academic Search

    Joseph Leonelli; Jan van der Laan; Peter Holland; Leland Fletcher; Russell Warren

    1990-01-01

    A multiwavelength CO2 direct-detection DIAL system has been designed and developed to produce range-resolved vapor concentration contour plots of a 1 x 1 km grid at 20-m spatial resolution in 10 s intervals.

  15. A quantitative method for groundwater surveillance monitoring network design at the Hanford Site

    SciTech Connect

    Meyer, P.D.

    1993-12-01

    As part of the Environmental Surveillance Program at the Hanford Site, mandated by the US Department of Energy, hundreds of groundwater wells are sampled each year, with each sample typically analyzed for a variety of constituents. The groundwater sampling program must satisfy several broad objectives. These objectives include an integrated assessment of the condition of groundwater and the identification and quantification of existing, emerging, or potential groundwater problems. Several quantitative network design objectives are proposed and a mathematical optimization model is developed from these objectives. The model attempts to find minimum cost network alternatives that maximize the amount of information generated by the network. Information is measured both by the rate of change with respect to time of the contaminant concentration and the uncertainty in contaminant concentration. In an application to tritium monitoring at the Hanford Site, both information measures were derived from historical data using time series analysis.
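    The flavor of the cost-information trade-off can be conveyed with a toy heuristic: score each candidate well by an information measure (here an invented combination of concentration trend and uncertainty, echoing the two measures named above) and greedily add the best score-per-cost wells until the sampling budget is spent. This is only an illustrative stand-in, not the optimization model developed in the report.

```python
# Greedy selection of monitoring wells under a sampling budget.
# Each well has a per-year sampling cost and an information score combining
# the contaminant trend magnitude and the concentration uncertainty.
wells = {
    "W-01": {"cost": 1200, "trend": 0.8, "uncertainty": 0.4},
    "W-02": {"cost": 900,  "trend": 0.2, "uncertainty": 0.9},
    "W-03": {"cost": 1500, "trend": 0.6, "uncertainty": 0.7},
    "W-04": {"cost": 700,  "trend": 0.1, "uncertainty": 0.2},
}  # hypothetical values
budget = 3000

def info(w):
    return w["trend"] + w["uncertainty"]   # toy information measure

selected, spent = [], 0
for name, w in sorted(wells.items(),
                      key=lambda kv: info(kv[1]) / kv[1]["cost"], reverse=True):
    if spent + w["cost"] <= budget:
        selected.append(name)
        spent += w["cost"]

print("selected wells:", selected, "total cost:", spent)
```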

  16. Design and methods of the Adult Inuit Health Survey 2007–2008

    PubMed Central

    Saudny, Helga; Leggee, Donna; Egeland, Grace

    2012-01-01

    Background: The Canadian International Polar Year (IPY) program made it possible to undertake much-needed health research in 3 jurisdictions within the Canadian Inuit Nunangat (homeland) over a 2-year period: Inuvialuit Settlement Region (ISR), Nunavut Territory, and Nunatsiavut. Design: The Adult Inuit Health Survey (IHS) was a cross-sectional survey and provides baseline data upon which future comparisons can be made for prospectively assessing factors leading to the progression of chronic diseases among Canadian Inuit. With the help of the Canadian Coast Guard Ship Amundsen, which was equipped with research and laboratory facilities, 33 coastal communities were visited; land survey teams visited 3 inland communities. Results: The Adult IHS succeeded in obtaining important baseline information concerning the health status and living conditions of 2,595 adults living in ISR, Nunavut and Nunatsiavut. Conclusion: Information from this survey will be useful for future comparisons and the opportunity to link with the International Inuit Cohort, a follow-up evaluation, and for the development of future health policies and public health interventions. PMID:23166895

  17. Virtual fashion and avatar design: a survey of consumers and designers

    Microsoft Academic Search

    Jeffrey Bardzell; Tyler Pace; Jennifer Terrell

    2010-01-01

    As virtual worlds evolve, so does the visual language of avatars inside them. In Second Life, an emergent virtual fashion industry supports amateur fashion\\/avatar design. This fashion industry includes both emergent (i.e., user-created) social institutions as well as a network of technologies, including Second Life's virtual environment itself, which support a sophisticated fusion of technical and cultural practices. This paper

  18. STATISTICAL BASIS FOR THE DESIGN AND INTERPRETATION OF THE NATIONAL SURFACE WATER SURVEY. PHASE 1. LAKES AND STREAMS

    EPA Science Inventory

    The primary objectives of Phase I of the National Surface Water Survey were to determine the number of acidic or potentially acidic lakes and streams, their location, and their physical and chemical characteristics. To meet these objectives, a statistically designed survey was im...

  19. Design Effects and Generalized Variance Functions for the 1990-91 Schools and Staffing Survey (SASS). Volume II. Technical Report.

    ERIC Educational Resources Information Center

    Salvucci, Sameena; And Others

    This technical report provides the results of a study on the calculation and use of generalized variance functions (GVFs) and design effects for the 1990-91 Schools and Staffing Survey (SASS). The SASS is a periodic integrated system of sample surveys conducted by the National Center for Education Statistics (NCES) that produces sampling variances…

  20. Improving the design of acoustic and midwater trawl surveys through stratification, with an application to Lake Michigan prey fishes

    USGS Publications Warehouse

    Adams, J.V.; Argyle, R.L.; Fleischer, G.W.; Curtis, G.L.; Stickel, R.G.

    2006-01-01

    Reliable estimates of fish biomass are vital to the management of aquatic ecosystems and their associated fisheries. Acoustic and midwater trawl surveys are an efficient sampling method for estimating fish biomass in large bodies of water. To improve the precision of biomass estimates from combined acoustic and midwater trawl surveys, sampling effort should be optimally allocated within each stage of the survey design. Based on information collected during fish surveys, we developed an approach to improve the design of combined acoustic and midwater trawl surveys through stratification. Geographic strata for acoustic surveying and depth strata for midwater trawling were defined using neighbor-restricted cluster analysis, and the optimal allocation of sampling effort for each was then determined. As an example, we applied this survey stratification approach to data from lakewide acoustic and midwater trawl surveys of Lake Michigan prey fishes. Precision of biomass estimates from surveys with and without geographic stratification was compared through resampling. Use of geographic stratification with optimal sampling allocation reduced the variance of Lake Michigan acoustic biomass estimates by 77%. Stratification and optimal allocation at each stage of an acoustic and midwater trawl survey should serve to reduce the variance of the resulting biomass estimates.
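    The optimal-allocation step described above can be illustrated with classical Neyman allocation across geographic strata; the stratum sizes and standard deviations below are assumed values for illustration, not the Lake Michigan data.

```python
# Sketch of Neyman optimal allocation: sampling effort is allocated to each
# stratum in proportion to (stratum size) x (stratum standard deviation).
import numpy as np

def neyman_allocation(N_h, S_h, n_total):
    """Allocate n_total sampling units proportionally to N_h * S_h."""
    N_h, S_h = np.asarray(N_h, float), np.asarray(S_h, float)
    weights = N_h * S_h
    n_h = n_total * weights / weights.sum()
    # round and keep at least one unit per stratum (totals may shift slightly)
    return np.maximum(1, np.round(n_h)).astype(int)

# e.g., 4 geographic strata: number of possible transect units and the SD of
# acoustic biomass density from a pilot survey (illustrative numbers only)
N_h = [120, 80, 200, 60]
S_h = [15.0, 40.0, 5.0, 25.0]
print(neyman_allocation(N_h, S_h, n_total=40))
```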

  1. "Intelligent design" of a 3D reflection survey for the SAFOD drill-hole site

    NASA Astrophysics Data System (ADS)

    Alvarez, G.; Hole, J. A.; Klemperer, S. L.; Biondi, B.; Imhof, M.

    2003-12-01

    SAFOD seeks to better understand the earthquake process by drilling though the San Andreas fault (SAF) to sample an earthquake in situ. To capitalize fully on the opportunities presented by the 1D drill-hole into a complex fault zone we must characterize the surrounding 3D geology at a scale commensurate with the drilling observations, to provide the structural context to extrapolate 1D drilling results along the fault plane and into the surrounding 3D volume. Excellent active-2D and passive-3D seismic observations completed and underway lack the detailed 3D resolution required. Only an industry-quality 3D reflection survey can provide c. 25 m subsurface sample-spacing horizontally and vertically. A 3D reflection survey will provide subsurface structural and stratigraphic control at the 100-m level, mapping major geologic units, structural boundaries, and subsurface relationships between the many faults that make up the SAF fault system. A principal objective should be a reflection-image (horizon-slice through the 3D volume) of the near-vertical fault plane(s) to show variations in physical properties around the drill-hole. Without a 3D reflection image of the fault zone, we risk interpreting drilled anomalies as ubiquitous properties of the fault, or risk missing important anomalies altogether. Such a survey cannot be properly costed or technically designed without major planning. "Intelligent survey design" can minimize source and receiver effort without compromising data-quality at the fault target. Such optimization can in principal reduce the cost of a 3D seismic survey by a factor of two or three, utilizing the known surface logistic constraints, partially-known sub-surface velocity field, and the suite of scientific targets at SAFOD. Our methodology poses the selection of the survey parameters as an optimization process that allows the parameters to vary spatially in response to changes in the subsurface. The acquisition geometry is locally optimized for uniformity of subsurface illumination by a micro-genetic algorithm. We start by accurately establishing the correspondence between the subsurface area of the target reflector (in this case, the steeply-dipping SAF) and the part of the surface area whose sources and receivers contribute to its image using 3D ray-tracing. We then use dense acquisition parameters in that part of the survey area and use standard parameters in the rest of the survey area. This is the key idea that allows us to get optimum image quality with the least acquisition effort. The optimization also requires constraints from structural geologists and from the community who will interpret the results. The most critical parameters to our optimization process are the structural model of the target(s) (depth and geological dips) and the velocity model in the subsurface. We seek community input, and have formed a scientific advisory committee of academic and industry leaders, to help evaluate trade-offs for the community between cost, resolution and volume of the resultant data-set, and to ensure that an appropriate range of piggy-back experiments is developed to utilize the seismic sources available during the 3D experiment. The scientific output of our project will be a community-vetted design for a 3D reflection survey over SAFOD that is technically feasible, cost-effective, and most likely to yield the image and seismic parameter measurements that will best constrain the physical properties of the fault zone and their spatial variation.

  2. KUIPER BELT OBJECT OCCULTATIONS: EXPECTED RATES, FALSE POSITIVES, AND SURVEY DESIGN

    SciTech Connect

    Bickerton, S. J. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Welch, D. L. [Department of Physics and Astronomy, McMaster University, Hamilton, ON L8S 4M1 (Canada); Kavelaars, J. J. [Herzberg Institute of Astrophysics, Victoria, BC V9E 2E7 (Canada)], E-mail: bick@astro.princeton.edu, E-mail: welch@physics.mcmaster.ca, E-mail: JJ.Kavelaars@nrc-cnrc.gc.ca

    2009-05-15

    A novel method of generating artificial scintillation noise is developed and used to evaluate occultation rates and false positive rates for surveys probing the Kuiper Belt with the method of serendipitous stellar occultations. A thorough examination of survey design shows that (1) diffraction-dominated occultations are critically (Nyquist) sampled at a rate of 2 Fsu^-1, corresponding to 40 s^-1 for objects at 40 AU, (2) occultation detection rates are maximized when targets are observed at solar opposition, (3) Main Belt asteroids will produce occultation light curves identical to those of Kuiper Belt Objects (KBOs) if target stars are observed at solar elongations of 116° ≲ ε ≲ 125° or 131° ≲ ε ≲ 141°, and (4) genuine KBO occultations are likely to be so rare that a detection threshold of ≳7-8σ should be adopted to ensure that viable candidate events can be disentangled from false positives.
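    The quoted sampling requirement can be roughly reproduced with a short back-of-the-envelope calculation, assuming V-band light, circular orbits, and opposition geometry (assumptions not stated in the record itself).

```python
# Worked check of the ~40 s^-1 sampling figure under assumed values:
# Fresnel scale at 40 AU for ~550 nm light and the retrograde shadow
# velocity near opposition (Earth's orbital speed minus the KBO's).
import math

AU = 1.496e11                      # m
wavelength = 550e-9                # m (V band, assumed)
distance = 40 * AU

fresnel_scale = math.sqrt(wavelength * distance / 2)   # ~1.3 km
v_earth = 29.8e3                                        # m/s
v_kbo = v_earth / math.sqrt(40)                         # circular orbit at 40 AU
v_shadow = v_earth - v_kbo                              # ~25 km/s at opposition

fsu_per_second = v_shadow / fresnel_scale               # ~20 Fsu/s
nyquist_rate = 2 * fsu_per_second                       # ~40 samples/s
print(f"Fresnel scale: {fresnel_scale/1e3:.2f} km")
print(f"Required sampling: {nyquist_rate:.0f} s^-1")
```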

  3. Implementing the World Mental Health Survey Initiative in Portugal – rationale, design and fieldwork procedures

    PubMed Central

    2013-01-01

    Background The World Mental Health Survey Initiative was designed to evaluate the prevalence, the correlates, the impact and the treatment patterns of mental disorders. This paper describes the rationale and the methodological details regarding the implementation of the survey in Portugal, a country that still lacks representative epidemiological data about psychiatric disorders. Methods The World Mental Health Survey is a cross-sectional study with a representative sample of the Portuguese population, aged 18 or older, based on official census information. The WMH-Composite International Diagnostic Interview, adapted to the Portuguese language by a group of bilingual experts, was used to evaluate the mental health status, disorder severity, impairment, use of services and treatment. Interviews were administered face-to-face at respondents' dwellings, which were selected from a nationally representative multi-stage clustered area probability sample of households. The survey was administered using computer-assisted personal interview methods by trained lay interviewers. Data quality was strictly controlled in order to ensure the reliability and validity of the collected information. Results A total of 3,849 people completed the main survey, with 2,060 completing the long interview, corresponding to a response rate of 57.3%. Data cleaning was conducted in collaboration with the WMHSI Data Analysis Coordination Centre at the Department of Health Care Policy, Harvard Medical School. The collected information will provide lifetime and 12-month mental disorder diagnoses, according to the International Classification of Diseases and to the Diagnostic and Statistical Manual of Mental Disorders. Conclusions The findings of this study could have a major influence on mental health care policy planning efforts over the next few years, especially in a country that still has a significant level of unmet needs regarding mental health services organization, delivery of care and epidemiological research. PMID:23837605

  4. Research Design Decisions: An Integrated Quantitative and Qualitative Model for Decision-Making Researchers (You Too Can Be Lord of the Rings).

    ERIC Educational Resources Information Center

    Geroy, Gary D.; Wright, Phillip C.

    1997-01-01

    Presents a concentric research design model based on need for research which transcends individuals' historic or experiential bias concerning choice of study design, tools, and data reduction strategies. Describes the following "rings": theory/knowledge orientation; theory versus applied research; quantitative versus qualitative research…

  5. EEL 5764 Computer Architecture 1. Catalog Description (3 credits) Fundamentals in design and quantitative analysis of

    E-print Network

    Fang, Yuguang "Michael"

    . Instructor ­ Dr. Tao Li a. Office location: 339D Larsen Hall b. Telephone: 352-392-9510 c. E-mail address and Supply Fees - None #12;12. Textbooks and Software Required - a. Title: Computer Architecture. ISBN number: 0-07-057064-7 14. Course Outline ­ Fundamentals of Computer Design Performance, Power

  6. Injury survey of a non-traditional 'soft-edged' trampoline designed to lower equipment hazards.

    PubMed

    Eager, David B; Scarrott, Carl; Nixon, Jim; Alexander, Keith

    2013-01-01

    In Australia trampolines contribute one quarter of all childhood play equipment injuries. The objective of this study was to gather and evaluate injury data from a non-traditional, 'soft-edged', consumer trampoline, where the design aimed to minimise injuries from the equipment and from falling off. The manufacturer of the non-traditional trampoline provided the University of Technology Sydney with their Australian customer database. The study involved surveys in Queensland and New South Wales, between May 2007 and March 2010. Initially injury data was gathered by a phone interview pilot study, then in the full study, through an email survey. The 3817 respondents were the carers of child users of the 'soft-edge' trampolines. Responses were compared with Australian and US emergency department data. In both countries the proportion of injuries caused by the equipment and falling off was compared with the proportion caused by the jumpers to themselves or each other. The comparisons showed a significantly lower proportion resulted from falling-off or hitting the equipment for this design when compared to traditional trampolines, both in Australia and the US. This research concludes that equipment-induced and falling-off injuries, the more severe injuries on traditional trampolines, can be significantly reduced with appropriate trampoline design. PMID:22471672

  7. Elderly Nutrition and Health Survey in Taiwan (1999-2000): research design, methodology and content.

    PubMed

    Pan, Wen-Harn; Hung, Yung-Tai; Shaw, Ning-Sing; Lin, Wei; Lee, Shyh-Dye; Chiu, Cheng-Fen; Lin, Meng-Chiao; Chen, Ssu-Yuan; Hong, Chi-Min; Huang, Teng-Yuan; Chang, Hsing-Yi; Tu, Su-hao; Chang, Ya-Hui; Yeh, Wen-Ting; Su, Shu-Chen

    2005-01-01

    The purpose of the Elderly Nutrition and Health Survey in Taiwan (1999-2000) was to assess the diet, nutrition and health of persons aged 65 and above in Taiwan. A multi-staged, stratified, clustered probability sampling scheme was used in the survey. The survey population was stratified into a total of 13 strata. The four strata of "Hakka areas", "Mountain areas", "Eastern areas", and "PengHu islands" were unique in their ethnicity or geographic locations. The remaining areas of Taiwan were stratified into "Northern", "Central", and "Southern" parts, and each of these 3 strata was then subdivided into a further 3 strata based on population density. The household interviews were scheduled so that the effect of seasonal variation was taken into account. A total of 1,937 persons completed the interview and 2,432 persons completed the health exam. The following data were collected: (1) Interview data: household information, basic demographics, 24-hour dietary recall, food frequency and habits, knowledge, attitudes and practice, medical history, the 36-item Short Form for generic health status, and physical activity. (2) Health exam data: blood sample for measurement of nutritional biochemical indicators and a complete clinical chemistry profile, urine sample for urinary electrolytes, anthropometric measurements, ECG, blood pressure, body temperature, pulmonary function, and an osteoporosis assessment. Data from the survey were analyzed using SUDAAN to adjust for the design effect and to obtain unbiased estimates of means, standard errors and confidence intervals. Survey respondents were slightly younger than non-respondents; however, after weighting and adjustment with SUDAAN, the education levels and ethnicity of respondents and non-respondents were similar, indicating a lack of bias. We anticipate that the results of this survey will be of benefit in understanding the nutritional status of the elderly, the relationship between nutrition and health, and factors influencing elderly persons' nutritional status. Furthermore, this information could be used in the development of public health nutrition policy aimed at improving the nutrition and health of the elderly in Taiwan. PMID:16169830

  8. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data

    PubMed Central

    Lewis, Jesse S.; Gerber, Brian D.

    2014-01-01

    Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; explicitly recognizing that given a species occupies an area the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km² of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10–120 cameras) and occasions (20–120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. For common species with low detection (i.e., bobcat and coyote) the most efficient sampling approach was to increase the number of occasions (survey days). However, for common species that are moderately detectable (i.e., cottontail rabbit and mule deer), occupancy could reliably be estimated with comparatively low numbers of cameras over a short sampling period. We provide general guidelines for reliably estimating occupancy across a range of terrestrial species (rare to common: ψ = 0.175–0.970, and low to moderate detectability: p = 0.003–0.200) using motion-activated cameras. Wildlife researchers/managers with limited knowledge of the relative abundance and likelihood of detection of a particular species can apply these guidelines regardless of location. We emphasize the importance of prior biological knowledge, defined objectives and detailed planning (e.g., simulating different study-design scenarios) for designing effective monitoring programs and research studies. PMID:25210658
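    A minimal sketch of this kind of design simulation is shown below: detection histories are simulated for an assumed ψ and p, a basic single-season occupancy model is fit by maximum likelihood, and the estimator error is summarized for a given number of cameras and survey days. This is a simplified stand-in for the authors' simulation framework, not their actual code.

```python
# Simulate camera detection histories for assumed psi and p, fit a basic
# occupancy model by maximum likelihood, and report the RMSE of the
# occupancy estimate for a candidate design (sites x occasions).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

def simulate(psi, p, n_sites, n_occasions):
    z = rng.random(n_sites) < psi                        # true occupancy state
    y = (rng.random((n_sites, n_occasions)) < p) & z[:, None]
    return y.astype(int)

def neg_log_lik(theta, y):
    psi, p = 1 / (1 + np.exp(-theta))                    # logit -> probability
    k = y.shape[1]
    det = y.sum(axis=1)
    # site likelihood: occupied-and-detected, or never detected (occupied or not)
    lik = np.where(det > 0,
                   psi * p**det * (1 - p)**(k - det),
                   psi * (1 - p)**k + (1 - psi))
    return -np.log(lik).sum()

def design_error(psi, p, n_sites, n_occasions, n_sims=200):
    est = []
    for _ in range(n_sims):
        y = simulate(psi, p, n_sites, n_occasions)
        fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
        est.append(1 / (1 + np.exp(-fit.x[0])))          # back-transform psi estimate
    est = np.array(est)
    return np.sqrt(np.mean((est - psi) ** 2))            # RMSE of occupancy estimate

# compare two designs for a common but hard-to-detect species (illustrative values)
print(design_error(psi=0.8, p=0.05, n_sites=40, n_occasions=60))
print(design_error(psi=0.8, p=0.05, n_sites=40, n_occasions=120))
```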

  9. Questionnaire survey of customer satisfaction for product categories towards certification of ergonomic quality in design.

    PubMed

    Mochimaru, Masaaki; Takahashi, Miwako; Hatakenaka, Nobuko; Horiuchi, Hitoshi

    2012-01-01

    Customer satisfaction was surveyed for 6 product categories (consumer electronics, daily commodities, home equipment, information systems, cars, and health appliances) by questionnaires based on the Analytic Hierarchy Process. By analyzing the weights of the evaluation factors, the 6 product categories were reorganized into 4 categories related to 4 aspects of daily living formed by two axes: home living vs. mobility life, and healthy life vs. active communication. It was found that consumers were attracted by actual user testing by public institutes for all product categories. Certification based on a design process standard established by authorities, such as EQUID, was the second-best attractor for consumers. PMID:22316844
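    The Analytic Hierarchy Process step implied above amounts to deriving factor weights from a pairwise-comparison matrix via its principal eigenvector; the factors and comparison values below are invented for illustration.

```python
# AHP sketch: priority weights from a reciprocal pairwise-comparison matrix
# (principal eigenvector), plus Saaty's consistency ratio for n = 4 factors.
import numpy as np

# factors (assumed): price, ergonomic certification, actual user test, brand
A = np.array([[1,   3,   1/2, 2],
              [1/3, 1,   1/4, 1],
              [2,   4,   1,   3],
              [1/2, 1,   1/3, 1]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # normalized priority weights

lam_max = eigvals.real[k]
CI = (lam_max - len(A)) / (len(A) - 1)        # consistency index
CR = CI / 0.90                                # random index RI = 0.90 for n = 4
print(np.round(weights, 3), round(CR, 3))
```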

  10. Overview of Test Design: A Survey of Black Box Software Testing Techniques

    NSDL National Science Digital Library

    Cem Kaner

    2011-06-01

    This subset of the Black Box Software Testing collection includes resources for a broad survey of software test techniques, noting their different objectives, strengths, and blind spots. The materials present a few techniques more closely than the rest. Students will: gain familiarity with a variety of test techniques; learn structures for comparing objectives and strengths of different test techniques; use the Heuristic Test Strategy Model for test planning and design; and use concept mapping tools for test planning. Resources include lecture videos, slides, activities, suggested readings, and study guide materials.

  11. Survey and design of pictographic sign messages used in the hotel/motel industry

    E-print Network

    Nichols, Kenneth Aaron

    1992-01-01

    [OCR excerpt from the thesis front matter and tabulated symbol-comprehension results: significance levels reported for symbol, age, and visits (0.01 level), with percentages of right, wrong, and partially right responses (N = 160 total); one tabulated item concerns "Government Employee Discounts" symbols. Master of Science thesis, May 1992, Major Subject: Industrial Engineering: "Survey and Design of Pictographic Sign Messages Used in the Hotel/Motel Industry," by Kenneth Aaron Nichols; approved as to style and content by R. Dale Huchingson (Chair of Committee).]

  12. The design and construction of an infrared detector for use with a highway traffic survey system

    E-print Network

    Mundkowsky, William Fredrick

    1961-01-01

    [OCR excerpt from the thesis front matter: listed figures include "Spectral Distribution of Energy for Perfect Emitters," "The Effects of Stops," and "Cross Section of a Fabry-Perot Interference Filter"; listed tables include "Figures of Merit," "Data Summary Sheet for InSb PEM Detector," and "Properties of Artificial Sapphire (Al2O3)." The introduction begins: "The highway design engineer needs a reliable, compact, portable detector to be used with a traffic survey system. The present method of detection using the road tube has disadvantages in that the road tube..."]

  13. Statistical methods for efficient design of community surveys of response to noise: Random coefficients regression models

    NASA Technical Reports Server (NTRS)

    Tomberlin, T. J.

    1985-01-01

    Research studies of residents' responses to noise consist of interviews with samples of individuals who are drawn from a number of different compact study areas. The statistical techniques developed provide a basis for those sample design decisions. These techniques are suitable for a wide range of sample survey applications. A sample may consist of a random sample of residents selected from a sample of compact study areas, or in a more complex design, of a sample of residents selected from a sample of larger areas (e.g., cities). The techniques may be applied to estimates of the effects on annoyance of noise level, numbers of noise events, the time-of-day of the events, ambient noise levels, or other factors. Methods are provided for determining, in advance, how accurately these effects can be estimated for different sample sizes and study designs. Using a simple cost function, they also provide for optimum allocation of the sample across the stages of the design for estimating these effects. These techniques are developed via a regression model in which the regression coefficients are assumed to be random, with components of variance associated with the various stages of a multi-stage sample design.
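    A random-coefficients regression of this kind can be sketched with a mixed-effects model in which both the intercept and the noise-level slope vary across study areas; the simulated data and variable names below are assumptions for illustration, not the report's dataset.

```python
# Illustrative random-coefficients model: annoyance regressed on noise level,
# with a random intercept and a random noise slope per study area.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_areas, n_per_area = 20, 30
area = np.repeat(np.arange(n_areas), n_per_area)
noise = rng.uniform(50, 80, size=area.size)          # e.g. day-night sound level, dB

# area-level random intercepts and slopes plus individual residual error
b0 = rng.normal(0.0, 1.0, n_areas)
b1 = rng.normal(0.0, 0.02, n_areas)
annoyance = -4 + 0.12 * noise + b0[area] + b1[area] * noise + rng.normal(0, 1, area.size)

df = pd.DataFrame({"annoyance": annoyance, "noise": noise, "area": area})
model = smf.mixedlm("annoyance ~ noise", df, groups=df["area"], re_formula="~noise")
result = model.fit()
print(result.summary())     # fixed effect of noise plus the variance components
```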

  14. The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    NASA Technical Reports Server (NTRS)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Coil, Alison L.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan, Renbin; Harker, Justin J.; Kassin, Susan A.; Konidaris, N. P.; Lai, Kamson; Madgwick, Darren S.; Noeske, K. G.; Wirth, Gregory D.; Kirby, Evan N.; Lotz, Jennifer M.

    2013-01-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z approx. 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M_B = -20 at z approx. 1 via approx. 90 nights of observation on the Keck telescope. The survey covers an area of 2.8 sq. deg divided into four separate fields observed to a limiting apparent magnitude of R_AB = 24.1. Objects with z approx. < 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted approx. 2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z approx. 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm^-1 grating used for the survey delivers high spectral resolution (R approx. 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. Redshift errors and catastrophic failure rates are assessed through more than 2000 objects with duplicate observations. Sky subtraction is essentially photon-limited even under bright OH sky lines; we describe the strategies that permitted this, based on high image stability, accurate wavelength solutions, and powerful B-spline modeling methods. We also investigate the impact of targets that appear to be single objects in ground-based targeting imaging but prove to be composite in Hubble Space Telescope data; they constitute several percent of targets at z approx. 1, approaching approx. 5%-10% at z > 1.5. Summary data are given that demonstrate the superiority of DEEP2 over other deep high-precision redshift surveys at z approx. 1 in terms of redshift accuracy, sample number density, and amount of spectral information. We also provide an overview of the scientific highlights of the DEEP2 survey thus far.

  15. The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    NASA Astrophysics Data System (ADS)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Coil, Alison L.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan, Renbin; Harker, Justin J.; Kassin, Susan A.; Konidaris, N. P.; Lai, Kamson; Madgwick, Darren S.; Noeske, K. G.; Wirth, Gregory D.; Connolly, A. J.; Kaiser, N.; Kirby, Evan N.; Lemaux, Brian C.; Lin, Lihwai; Lotz, Jennifer M.; Luppino, G. A.; Marinoni, C.; Matthews, Daniel J.; Metevier, Anne; Schiavon, Ricardo P.

    2013-09-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ~ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude MB = -20 at z ~ 1 via ~90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg2 divided into four separate fields observed to a limiting apparent magnitude of R AB = 24.1. Objects with z <~ 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted ~2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ~ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm-1 grating used for the survey delivers high spectral resolution (R ~ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. Redshift errors and catastrophic failure rates are assessed through more than 2000 objects with duplicate observations. Sky subtraction is essentially photon-limited even under bright OH sky lines; we describe the strategies that permitted this, based on high image stability, accurate wavelength solutions, and powerful B-spline modeling methods. We also investigate the impact of targets that appear to be single objects in ground-based targeting imaging but prove to be composite in Hubble Space Telescope data; they constitute several percent of targets at z ~ 1, approaching ~5%-10% at z > 1.5. Summary data are given that demonstrate the superiority of DEEP2 over other deep high-precision redshift surveys at z ~ 1 in terms of redshift accuracy, sample number density, and amount of spectral information. We also provide an overview of the scientific highlights of the DEEP2 survey thus far. Based on observations taken at the W. M. Keck Observatory, which is operated jointly by the University of California and the California Institute of Technology, and on observations made with the NASA/ESO Hubble Space Telescope, obtained from the data archives at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555, and from the Canadian Astronomy Data Centre.

  16. THE DEEP2 GALAXY REDSHIFT SURVEY: DESIGN, OBSERVATIONS, DATA REDUCTION, AND REDSHIFTS

    SciTech Connect

    Newman, Jeffrey A. [Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, PA 15260 (United States); Cooper, Michael C. [Center for Galaxy Evolution, Department of Physics and Astronomy, University of California, Irvine, 4129 Frederick Reines Hall, Irvine, CA 92697 (United States); Davis, Marc [Department of Astronomy and Physics, University of California, 601 Campbell Hall, Berkeley, CA 94720 (United States); Faber, S. M.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Harker, Justin J.; Lai, Kamson [UCO/Lick Observatory, University of California, 1156 High Street, Santa Cruz, CA 95064 (United States); Coil, Alison L. [Department of Physics, University of California, San Diego, La Jolla, CA 92093 (United States); Dutton, Aaron A. [Max Planck Institute for Astronomy, Koenigstuhl 17, D-69117 Heidelberg (Germany); Finkbeiner, Douglas P. [Harvard-Smithsonian Center for Astrophysics, Harvard University, 60 Garden St., Cambridge, MA 02138 (United States); Gerke, Brian F. [Lawrence Berkeley National Laboratory, 1 Cyclotron Rd., MS 90R4000, Berkeley, CA 94720 (United States); Rosario, David J. [Max-Planck-Institut fuer Extraterrestrische Physik, Giessenbachstrasse, D-85748 Garching (Germany); Weiner, Benjamin J.; Willmer, C. N. A. [Steward Observatory, University of Arizona, 933 N. Cherry Ave., Tucson, AZ 85721-0065 (United States); Yan Renbin [Department of Physics and Astronomy, University of Kentucky, 505 Rose Street, Lexington, KY 40506-0055 (United States); Kassin, Susan A. [Astrophysics Science Division, Goddard Space Flight Center, Code 665, Greenbelt, MD 20771 (United States); Konidaris, N. P., E-mail: janewman@pitt.edu, E-mail: djm70@pitt.edu, E-mail: m.cooper@uci.edu, E-mail: mdavis@berkeley.edu, E-mail: faber@ucolick.org, E-mail: koo@ucolick.org, E-mail: raja@ucolick.org, E-mail: phillips@ucolick.org [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); and others

    2013-09-15

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ~ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M_B = -20 at z ~ 1 via ~90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg² divided into four separate fields observed to a limiting apparent magnitude of R_AB = 24.1. Objects with z ≲ 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted ~2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ~ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm^-1 grating used for the survey delivers high spectral resolution (R ~ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. Redshift errors and catastrophic failure rates are assessed through more than 2000 objects with duplicate observations. Sky subtraction is essentially photon-limited even under bright OH sky lines; we describe the strategies that permitted this, based on high image stability, accurate wavelength solutions, and powerful B-spline modeling methods. We also investigate the impact of targets that appear to be single objects in ground-based targeting imaging but prove to be composite in Hubble Space Telescope data; they constitute several percent of targets at z ~ 1, approaching ~5%-10% at z > 1.5. Summary data are given that demonstrate the superiority of DEEP2 over other deep high-precision redshift surveys at z ~ 1 in terms of redshift accuracy, sample number density, and amount of spectral information. We also provide an overview of the scientific highlights of the DEEP2 survey thus far.

  17. Biochip array technology immunoassay performance and quantitative confirmation of designer piperazines for urine workplace drug testing.

    PubMed

    Castaneto, Marisol S; Barnes, Allan J; Concheiro, Marta; Klette, Kevin L; Martin, Thomas A; Huestis, Marilyn A

    2015-06-01

    Designer piperazines are emerging novel psychoactive substances (NPS) with few high-throughput screening methods for their identification. We evaluated a biochip array technology (BAT) immunoassay for phenylpiperazines (PNP) and benzylpiperazines (BZP) and analyzed 20,017 randomly collected urine workplace specimens. Immunoassay performance at recommended cutoffs was evaluated for PNPI (5 µg/L), PNPII (7.5 µg/L), and BZP (5 µg/L) antibodies. Eight hundred forty positive and 206 randomly selected presumptive negative specimens were confirmed by liquid chromatography high-resolution mass spectrometry (LC-HRMS). Assay limits of detection for PNPI, PNPII, and BZP were 2.9, 6.3, and 2.1 µg/L, respectively. Calibration curves were linear (R² > 0.99) with upper limits of 42 µg/L for PNPI/PNPII and 100 µg/L for BZP. Quality control samples demonstrated imprecision <19.3 %CV and accuracies of 86.0-94.5 % of target. There were no interferences from 106 non-piperazine substances. Seventy-eight of 840 presumptive positive specimens (9.3 %) were LC-HRMS positive, with 72 positive for 1-(3-chlorophenyl)piperazine (mCPP), a designer piperazine and antidepressant trazodone metabolite. Of 206 presumptive negative specimens, one confirmed positive for mCPP (3.3 µg/L) and one for BZP (3.6 µg/L). BAT specificity (21.1 to 91.4 %) and efficiency (27.0 to 91.6 %) increased, and sensitivity slightly decreased (97.5 to 93.8 %), with optimized cutoffs of 25 µg/L PNPI, 42 µg/L PNPII, and 100 µg/L BZP. A high-throughput screening method is needed to identify piperazine NPS. We evaluated performance of the Randox BAT immunoassay to identify urinary piperazines and documented improved performance when antibody cutoffs were raised. In addition, in randomized workplace urine specimens, all but two positive specimens contained mCPP and/or trazodone, most likely from legitimate medical prescriptions. Graphical Abstract Biochip array technology (BAT) immunoassay for designer piperazine detection in urine. In a chemiluminescent immunoassay, the labeled drug (antigen) competes with the drug in the urine. In the absence of drug, the labeled drug binds to the antibody, releasing an enzyme (horseradish peroxidase) to react with the substrate and produce chemiluminescence. The higher the drug concentration in urine, the weaker the chemiluminescent signal produced. All presumptive positive specimens and randomly selected presumptive negative specimens were analyzed and confirmed by liquid chromatography high-resolution mass spectrometry with a limit of quantification of 2.5 or 5 µg/L. PMID:25903022
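    The screening-performance figures above (sensitivity, specificity, and efficiency at a given cutoff, judged against LC-HRMS confirmation) follow from standard 2x2 contingency arithmetic; the sketch below uses toy data, not the study specimens.

```python
# Compute sensitivity, specificity, and efficiency of a screening assay at a
# chosen cutoff, using confirmatory results as the reference standard.
import numpy as np

def screening_metrics(screen_conc, confirmed_positive, cutoff):
    presumptive = screen_conc >= cutoff
    tp = np.sum(presumptive & confirmed_positive)
    tn = np.sum(~presumptive & ~confirmed_positive)
    fp = np.sum(presumptive & ~confirmed_positive)
    fn = np.sum(~presumptive & confirmed_positive)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    efficiency = (tp + tn) / (tp + tn + fp + fn)
    return sensitivity, specificity, efficiency

rng = np.random.default_rng(3)
confirmed = rng.random(1000) < 0.05                 # ~5% true positives (toy prevalence)
conc = np.where(confirmed,
                rng.lognormal(3.5, 0.5, 1000),      # confirmed-positive concentrations
                rng.lognormal(1.5, 0.8, 1000))      # background cross-reactivity
for cutoff in (5, 25):                              # µg/L, illustrative cutoffs
    print(cutoff, [round(x, 3) for x in screening_metrics(conc, confirmed, cutoff)])
```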

  18. S-CANDELS: The Spitzer-Cosmic Assembly Near-Infrared Deep Extragalactic Survey. Survey Design, Photometry, and Deep IRAC Source Counts

    E-print Network

    Ashby, M L N; Fazio, G G; Dunlop, J S; Egami, E; Faber, S M; Ferguson, H C; Grogin, N A; Hora, J L; Huang, J -S; Koekemoer, A M; Labbe, I; Wang, Z

    2015-01-01

    The Spitzer-Cosmic Assembly Deep Near-Infrared Extragalactic Legacy Survey (S-CANDELS; PI G. Fazio) is a Cycle 8 Exploration Program designed to detect galaxies at very high redshifts (z > 5). To mitigate the effects of cosmic variance and also to take advantage of deep coextensive coverage in multiple bands by the Hubble Space Telescope Multi-Cycle Treasury Program CANDELS, S-CANDELS was carried out within five widely separated extragalactic fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the HST Deep Field North, and the Extended Groth Strip. S-CANDELS builds upon the existing coverage of these fields from the Spitzer Extended Deep Survey (SEDS) by increasing the integration time from 12 hours to a total of 50 hours but within a smaller area, 0.16 square degrees. The additional depth significantly increases the survey completeness at faint magnitudes. This paper describes the S-CANDELS survey design, processing, and publicly-available data products. We present IRAC dual-...

  19. Survey of alternative gas turbine engine and cycle design. Final report

    SciTech Connect

    Lukas, H.

    1986-02-01

    In the period of the 1940's to 1960's much experimentation was performed in the areas of intercooling, reheat, and recuperation, as well as the use of low-grade fuels in gas turbines. The Electric Power Research Institute (EPRI), in an effort to document past experience which can be used as the basis for current design activities, commissioned a study to document alternate cycles and components used in gas turbine design. The study was performed by obtaining the important technical and operational criteria of the cycles through a literature search of published documents, articles, and papers. Where possible the information was augmented through dialogue with persons associated with those cycles and with the manufacturers. The survey indicated that many different variations of the simple open-cycle gas turbine plant were used. Many of these changes resulted in increases in efficiency over the low simple-cycle efficiency of that period. Metallurgy, as well as compressor and turbine design, limited the simple-cycle efficiency to the upper teens. The cycle modifications increased those efficiencies to the twenties and thirties. Advances in metallurgy as well as compressor and turbine design, coupled with the decrease in fuel cost, stopped the development of these complex cycles. Many of the plants operated successfully for many years, and only because newer simple-cycle gas turbine plants and large steam plants had better heat rates were these units shut down or put into stand-by service. 24 refs., 25 figs., 114 tabs.

  20. Methodology and Accuracy of Estimation of Quantitative Trait Loci Parameters in a Half-Sib Design Using Maximum Likelihood

    PubMed Central

    Mackinnon, M. J.; Weller, J. I.

    1995-01-01

    Maximum likelihood methods were developed for estimation of the six parameters relating to a marker-linked quantitative trait locus (QTL) segregating in a half-sib design, namely the QTL additive effect, the QTL dominance effect, the population mean, recombination between the marker and the QTL, the population frequency of the QTL alleles, and the within-family residual variance. The method was tested on simulated stochastic data with various family structures under two genetic models. A method for predicting the expected value of the likelihood was also derived and used to predict the lower bound sampling errors of the parameter estimates and the correlations between them. It was found that standard errors and confidence intervals were smallest for the population mean and variance, intermediate for QTL effects and allele frequency, and highest for recombination rate. Correlations among standard errors of the parameter estimates were generally low except for a strong negative correlation (r = -0.9) between the QTL's dominance effect and the population mean, and medium positive and negative correlations between the QTL's additive effect and, respectively, recombination rate (r = 0.5) and residual variance (r = -0.6). The implications for experimental design and method of analysis on power and accuracy of marker-QTL linkage experiments were discussed. PMID:8647408
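    A highly simplified sketch of the underlying mixture likelihood is given below: one sire family with known linkage phase, no dominance, and the population mean and residual variance treated as known, with only the QTL additive effect and the recombination rate estimated by grid-search maximum likelihood. This illustrates the idea of the normal-mixture likelihood, not the authors' six-parameter method.

```python
# Simulate a half-sib (daughter) family segregating a marker-linked QTL and
# recover the additive effect and recombination rate by grid-search ML.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
mu, a, r, sigma, n = 0.0, 1.0, 0.1, 1.0, 500     # true values (assumed)

# paternal marker allele (1 = M), then paternal QTL allele given recombination r
m = rng.integers(0, 2, n)
q = np.where(rng.random(n) < (1 - r), m, 1 - m)   # phase: Q travels with M
y = mu + (q - 0.5) * a + rng.normal(0, sigma, n)  # phenotypes, +/- a/2 around mu

def log_lik(a_hat, r_hat):
    mean_q, mean_nq = mu + a_hat / 2, mu - a_hat / 2
    # mixture weights depend on which marker allele was inherited
    w_q = np.where(m == 1, 1 - r_hat, r_hat)
    dens = w_q * norm.pdf(y, mean_q, sigma) + (1 - w_q) * norm.pdf(y, mean_nq, sigma)
    return np.log(dens).sum()

a_grid = np.linspace(0, 2, 41)
r_grid = np.linspace(0.0, 0.5, 26)
ll = np.array([[log_lik(ah, rh) for rh in r_grid] for ah in a_grid])
i, j = np.unravel_index(np.argmax(ll), ll.shape)
print("a_hat =", a_grid[i], " r_hat =", r_grid[j])
```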

  1. EVALUATION OF VISUAL SURVEY PROGRAMS FOR MONITORING COHO SALMON ESCAPEMENT IN

    E-print Network

    [Front-matter excerpt] Title of Research Project: Evaluation of Visual Survey Programs for Monitoring Coho Salmon Escapement ... ABSTRACT: Canada's Wild Salmon Policy (WSP) requires that quantitative survey designs be used

  2. The Hawk-I UDS and GOODS Survey (HUGS): Survey design and deep K-band number counts

    E-print Network

    Fontana, A; Paris, D; Targett, T A; Boutsia, K; Castellano, M; Galametz, A; Grazian, A; McLure, R; Merlin, E; Pentericci, L; Wuyts, S; Almaini, O; Caputi, K; Chary, R R; Cirasuolo, M; Conselice, C J; Cooray, A; Daddi, E; Dickinson, M; Faber, S M; Fazio, G; Ferguson, H C; Giallongo, E; Giavalisco, M; Grogin, N A; Hathi, N; Koekemoer, A M; Koo, D C; Lucas, R A; Nonino, M; Rix, H W; Renzini, A; Rosario, D; Santini, P; Scarlata, C; Sommariva, V; Stark, D P; van der Wel, A; Vanzella, E; Wild, V; Yan, H; Zibetti, S

    2014-01-01

    We present the results of a new, ultra-deep, near-infrared imaging survey executed with the Hawk-I imager at the ESO VLT, of which we make all the data public. This survey, named HUGS (Hawk-I UDS and GOODS Survey), provides deep, high-quality imaging in the K and Y bands over the CANDELS UDS and GOODS-South fields. We describe here the survey strategy, the data reduction process, and the data quality. HUGS delivers the deepest and highest quality K-band images ever collected over areas of cosmological interest, and ideally complements the CANDELS data set in terms of image quality and depth. The seeing is exceptional and homogeneous, confined to the range 0.38"-0.43". In the deepest region of the GOODS-S field (which includes most of the HUDF) the K-band exposure time exceeds 80 hours of integration, yielding a 1-sigma magnitude limit of ~28.0 mag per sq. arcsec. In the UDS field the survey matches the shallower depth of the CANDELS images, reaching a 1-sigma limit per sq. arcsec of ~27.3 mag in the K band and ~28.3m...

  3. S-CANDELS: The Spitzer-Cosmic Assembly Near-Infrared Deep Extragalactic Survey. Survey Design, Photometry, and Deep IRAC Source Counts

    NASA Astrophysics Data System (ADS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Dunlop, J. S.; Egami, E.; Faber, S. M.; Ferguson, H. C.; Grogin, N. A.; Hora, J. L.; Huang, J.-S.; Koekemoer, A. M.; Labbé, I.; Wang, Z.

    2015-06-01

    The Spitzer-Cosmic Assembly Deep Near-infrared Extragalactic Legacy Survey (S-CANDELS; PI G. Fazio) is a Cycle 8 Exploration Program designed to detect galaxies at very high redshifts (z > 5). To mitigate the effects of cosmic variance and also to take advantage of deep coextensive coverage in multiple bands by the Hubble Space Telescope (HST) Multi-cycle Treasury Program CANDELS, S-CANDELS was carried out within five widely separated extragalactic fields: the UKIDSS Ultra-deep Survey, the Extended Chandra Deep Field South, COSMOS, the HST Deep Field North, and the Extended Groth Strip. S-CANDELS builds upon the existing coverage of these fields from the Spitzer Extended Deep Survey (SEDS), a Cycle 6 Exploration Program, by increasing the integration time from SEDS' 12 hr to a total of 50 hr but within a smaller area, 0.16 deg². The additional depth significantly increases the survey completeness at faint magnitudes. This paper describes the S-CANDELS survey design, processing, and publicly available data products. We present Infrared Array Camera (IRAC) dual-band 3.6 and 4.5 µm catalogs reaching to a depth of 26.5 AB mag. Deep IRAC counts for the roughly 135,000 galaxies detected by S-CANDELS are consistent with models based on known galaxy populations. The increase in depth beyond earlier Spitzer/IRAC surveys does not reveal a significant additional contribution from discrete sources to the diffuse Cosmic Infrared Background (CIB). Thus it remains true that only roughly half of the estimated CIB flux from COBE/DIRBE is resolved.

  4. High-Resolution Linkage and Quantitative Trait Locus Mapping Aided by Genome Survey Sequencing: Building Up An Integrative Genomic Framework for a Bivalve Mollusc

    PubMed Central

    Jiao, Wenqian; Fu, Xiaoteng; Dou, Jinzhuang; Li, Hengde; Su, Hailin; Mao, Junxia; Yu, Qian; Zhang, Lingling; Hu, Xiaoli; Huang, Xiaoting; Wang, Yangfan; Wang, Shi; Bao, Zhenmin

    2014-01-01

    Genetic linkage maps are indispensable tools in genetic and genomic studies. Recent development of genotyping-by-sequencing (GBS) methods holds great promise for constructing high-resolution linkage maps in organisms lacking extensive genomic resources. In the present study, linkage mapping was conducted for a bivalve mollusc (Chlamys farreri) using a newly developed GBS method—2b-restriction site-associated DNA (2b-RAD). Genome survey sequencing was performed to generate a preliminary reference genome that was utilized to facilitate linkage and quantitative trait locus (QTL) mapping in C. farreri. A high-resolution linkage map was constructed with a marker density (3806) that has, to our knowledge, never been achieved in any other molluscs. The linkage map covered nearly the whole genome (99.5%) with a resolution of 0.41 cM. QTL mapping and association analysis congruously revealed two growth-related QTLs and one potential sex-determination region. An important candidate QTL gene named PROP1, which functions in the regulation of growth hormone production in vertebrates, was identified from the growth-related QTL region detected on the linkage group LG3. We demonstrate that this linkage map can serve as an important platform for improving genome assembly and unifying multiple genomic resources. Our study, therefore, exemplifies how to build up an integrative genomic framework in a non-model organism. PMID:24107803

  5. A survey of factors associated with the successful recognition of agonal breathing and cardiac arrest by 9-1-1 call takers: design and methodology

    PubMed Central

    Vaillancourt, Christian; Jensen, Jan L; Grimshaw, Jeremy; Brehaut, Jamie C; Charette, Manya; Kasaboski, Ann; Osmond, Martin; Wells, George A; Stiell, Ian G

    2009-01-01

    Background Cardiac arrest victims most often collapse at home, where only a modest proportion receives life-saving bystander cardiopulmonary resuscitation. As many as 40% of all sudden cardiac arrest victims have agonal or abnormal breathing in the first minutes following cardiac arrest. 9-1-1 call takers may wrongly interpret agonal breathing as a sign of life, and not initiate telephone cardiopulmonary resuscitation instructions. Improving 9-1-1 call takers' ability to recognize agonal breathing as a sign of cardiac arrest could result in improved bystander cardiopulmonary resuscitation and survival rates for out-of-hospital cardiac arrest victims. Methods/Design The overall goal of this study is to design and conduct a survey of 9-1-1 call takers in the province of Ontario to better understand the factors associated with the successful identification of cardiac arrest (including patients with agonal breathing) over the phone, and subsequent administration of cardiopulmonary resuscitation instructions to callers. This study will be conducted in three phases using the Theory of Planned Behaviour. In Phase One, we will conduct semi-structured qualitative interviews with a purposeful selection of 9-1-1 call takers from Ontario, and identify common themes and belief categories. In Phase Two, we will use the qualitative interview results to design and pilot a quantitative survey. In Phase Three, a final version of the quantitative survey will be administered via an electronic medium to all registered call takers in the province of Ontario. We will perform qualitative thematic analysis (Phase One) and regression modelling (Phases Two and Three), to determine direct and indirect relationship of behavioural constructs with intentions to provide cardiopulmonary resuscitation instructions. Discussion The results of this study will provide valuable insight into the factors associated with the successful recognition of agonal breathing and cardiac arrest by 9-1-1 call takers. This will guide future interventional studies, which may include continuing education and protocol changes, in order to help increase the number of callers appropriately receiving cardiopulmonary resuscitation instructions, and save the lives of more cardiac arrest victims. Trial registration Clinicaltrials.gov NCT00848588 PMID:19646269

  6. Developing an efficient modelling and data presentation strategy for ATDEM system comparison and survey design

    NASA Astrophysics Data System (ADS)

    Combrinck, Magdel

    2015-10-01

    Forward modelling of airborne time-domain electromagnetic (ATDEM) responses is frequently used to compare systems and design surveys for optimum detection of expected mineral exploration targets. It is a challenging exercise to display and analyse the forward modelled responses due to the large amount of data generated for three dimensional models as well as the system dependent nature of the data. I propose simplifying the display of ATDEM responses through using the dimensionless quantity of signal-to-noise ratios (signal:noise) instead of respective system units. I also introduce the concept of a three-dimensional signal:noise nomo-volume as an efficient tool to visually present and analyse large amounts of data. The signal:noise nomo-volume is a logical extension of the two-dimensional conductance nomogram. It contains the signal:noise values of all system time channels and components for various target depths and conductances integrated into a single interactive three-dimensional image. Responses are calculated over a complete survey grid and therefore include effects of system and target geometries. The user can interactively select signal:noise cut-off values on the nomo-volume and is able to perform visual comparisons between various system and target responses. The process is easy to apply and geophysicists with access to forward modelling airborne electromagnetic (AEM) and three-dimensional imaging software already possess the tools required to produce and analyse signal:noise nomo-volumes.
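    Conceptually, a signal:noise nomo-volume is a 3-D array of forward-modelled responses divided by the system noise, indexed by target depth, conductance, and time channel, which can then be thresholded interactively. The sketch below uses a placeholder decay law rather than a real ATDEM forward modeller, so all numbers are illustrative.

```python
# Assemble a toy signal:noise "nomo-volume" over depth x conductance x time
# channel and apply a user-chosen signal:noise cut-off.
import numpy as np

depths = np.linspace(50, 500, 19)            # target depths, m
conductances = np.logspace(0, 3, 25)         # target conductances, S
channels = np.geomspace(1e-4, 1e-2, 20)      # gate centre times, s
noise_floor = 1e-3                           # assumed system noise per channel

def toy_response(depth, conductance, t):
    """Placeholder decay law: stronger for conductive, shallow targets."""
    return conductance * np.exp(-depth / 150.0) * np.exp(-t * conductance / 5.0) * 1e-4

D, C, T = np.meshgrid(depths, conductances, channels, indexing="ij")
signal_to_noise = toy_response(D, C, T) / noise_floor   # the 3-D nomo-volume

detectable = signal_to_noise >= 3.0                     # interactive cut-off, e.g. 3:1
frac = detectable.any(axis=2).mean()                    # visible in at least one channel
print(f"{frac:.0%} of depth-conductance combinations exceed 3:1 in some channel")
```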

  7. Sampling design for the Birth in Brazil: National Survey into Labor and Birth.

    PubMed

    Vasconcellos, Mauricio Teixeira Leite de; Silva, Pedro Luis do Nascimento; Pereira, Ana Paula Esteves; Schilithz, Arthur Orlando Correa; Souza Junior, Paulo Roberto Borges de; Szwarcwald, Celia Landmann

    2014-08-01

    This paper describes the sample design for the National Survey into Labor and Birth in Brazil. The hospitals with 500 or more live births in 2007 were stratified into: the five Brazilian regions; state capital or not; and type of governance. They were then selected with probability proportional to the number of live births in 2007. An inverse sampling method was used to select as many days (minimum of 7) as necessary to reach 90 interviews in the hospital. Postnatal women were sampled with equal probability from the set of eligible women, who had entered the hospital in the sampled days. Initial sample weights were computed as the reciprocals of the sample inclusion probabilities and were calibrated to ensure that total estimates of the number of live births from the survey matched the known figures obtained from the Brazilian System of Information on Live Births. For the two telephone follow-up waves (6 and 12 months later), the postnatal woman's response probability was modelled using baseline covariate information in order to adjust the sample weights for nonresponse in each follow-up wave. PMID:25167189
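    The weighting steps described (base weights as inverse inclusion probabilities, ratio calibration to a known total of live births, and a response-propensity adjustment for the follow-up waves) can be sketched as follows; the figures and the propensity stand-in are illustrative assumptions, not the survey's actual values.

```python
# Sketch of survey weighting: design weights, ratio calibration to a known
# total, and a nonresponse adjustment for a follow-up wave.
import numpy as np

rng = np.random.default_rng(4)
n = 400
pi = rng.uniform(0.002, 0.02, n)          # inclusion probability of each woman
base_w = 1.0 / pi                         # design (base) weights

# calibrate so that weighted births match a known national total (assumed figure)
known_total_births = 2_300_000
calib_w = base_w * known_total_births / base_w.sum()

# follow-up wave: adjust for nonresponse using fitted response propensities
responded = rng.random(n) < 0.75
propensity = np.clip(rng.beta(6, 2, n), 0.05, None)   # stand-in for a logistic model fit
followup_w = np.where(responded, calib_w / propensity, 0.0)

print(round(calib_w.sum()), round(followup_w.sum()))
```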

  8. Designing HIGH-COST Medicine Hospital Surveys, Health Planning, and the Paradox of Progressive Reform

    PubMed Central

    2010-01-01

    Inspired by social medicine, some progressive US health reforms have paradoxically reinforced a business model of high-cost medical delivery that does not match social needs. In analyzing the financial status of their areas’ hospitals, for example, city-wide hospital surveys of the 1910s through 1930s sought to direct capital investments and, in so doing, control competition and markets. The 2 national health planning programs that ran from the mid-1960s to the mid-1980s continued similar strategies of economic organization and management, as did the so-called market reforms that followed. Consequently, these reforms promoted large, extremely specialized, capital-intensive institutions and systems at the expense of less complex (and less costly) primary and chronic care. The current capital crisis may expose the lack of sustainability of such a model and open up new ideas and new ways to build health care designed to meet people's health needs. PMID:20019312

  9. Spectroscopic Survey Telescope design. III - Optical support structure and overall configuration

    NASA Astrophysics Data System (ADS)

    Ray, F. B.

    1990-07-01

    The Universities of Texas and Penn State are working together on an Arecibo-type optical telescope to be utilized in a semitransit mode for spectroscopic survey work. Its optics include a spherical primary mirror, a 2-element all-reflecting Gregorian spherical aberration corrector, and a series of optical fibers that will transmit light to a family of spectrographs. An optical support structure is being developed to permit position adjustment in azimuth only. During an azimuth position change, the instrument's entire weight is borne by steel rollers bearing on a circular crane rail of standard section, with support loads transmitted to the telescope base through pneumatic springs. Extensive application of various analytical procedures and computer-aided engineering tools has effectively allowed the detailed examination of several design iterations, thereby increasing the probability of success in the realized structure.

  10. Hot rocket plume experiment - Survey and conceptual design [of rhenium-iridium bipropellants]

    NASA Technical Reports Server (NTRS)

    Millard, Jerry M.; Luan, Taylor W.; Dowdy, Mack W.

    1992-01-01

    Attention is given to a space-borne engine plume experiment study to fly an experiment which will both verify and quantify the reduced contamination from advanced rhenium-iridium earth-storable bipropellant rockets (hot rockets) and provide a correlation between high-fidelity, in-space measurements and theoretical plume and surface contamination models. The experiment conceptual design is based on survey results from plume and contamination technologists throughout the U.S. With respect to shuttle use, cursory investigations validate Hitchhiker availability and adaptability, adequate remote manipulator system (RMS) articulation and dynamic capability, acceptable RMS attachment capability, adequate power and telemetry capability, and adequate flight altitude and attitude/orbital capability.

  11. Quantitative Analysis of Solar Technologies For Net-Zero Design Affordable Homes Research Group, School of Architecture, McGill University

    E-print Network

    Barthelat, Francois

    [Poster excerpt; most of the text was garbled in extraction. Recoverable headings: Principles & Results; Conclusions; Photovoltaic (PV) Energy Production; Water-Based Solar Thermal Collectors; Air-(truncated). The fragment also mentions ribbon silicon panels (thin-film sheets) offered as roofing shingles, transparent spray, and peel-(truncated) formats, with efficiencies quoted up to 18.0%.]

  12. Quantitative Assessment of a Senge Learning Organization Intervention

    ERIC Educational Resources Information Center

    Kiedrowski, P. Jay

    2006-01-01

    Purpose: To quantitatively assess a Senge learning organization (LO) intervention to determine if it would result in improved employee satisfaction. Design/methodology/approach: A Senge LO intervention in Division 123 of Company ABC was undertaken in 2000. Three employee surveys using Likert-scale questions over five years and correlation analysis…

  13. A systematic quantitative approach to rational drug design and discovery of novel human carbonic anhydrase IX inhibitors.

    PubMed

    Sethi, Kalyan K; Verma, Saurabh M

    2014-08-01

    Drug design involves the design of small molecules that are complementary in shape and charge to the biomolecular target with which they interact and to which they will therefore bind. Three-dimensional quantitative structure-activity relationship (3D-QSAR) studies were performed for a series of carbonic anhydrase IX inhibitors using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) techniques with the help of SYBYL 7.1 software. A large set of 36 different aromatic/heterocyclic sulfamate inhibitors of carbonic anhydrase (CA, EC 4.2.1.1), specifically of hCA IX, was chosen for this study. The conventional ligand-based 3D-QSAR studies were performed on the low-energy conformations employing the database alignment rule. The ligand-based model gave q(2) values of 0.802 and 0.829 and r(2) values of 1.000 and 0.994 for CoMFA and CoMSIA, respectively, and the predictive ability of the model was validated. The predicted r(2) values are 0.999 and 0.502 for CoMFA and CoMSIA, respectively. The SEA (steric, electrostatic, hydrogen bond acceptor) combination in CoMSIA made the most significant contribution to model development. Docking of the inhibitors into the hCA IX active site using the Glide XP (Schrödinger) software revealed the vital interactions and binding conformations of the inhibitors. The CoMFA and CoMSIA field contour maps agree well with the structural characteristics of the binding pocket of the hCA IX active site, which suggests that the information rendered by the 3D-QSAR models and the docking interactions can provide guidelines for the development of improved hCA IX inhibitors as leads for various types of metastatic cancers, including those of cervical, renal, breast, and head and neck origin. PMID:24090419
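
    For readers unfamiliar with the cross-validated q(2) statistic quoted above, the sketch below shows how a leave-one-out q(2) can be computed for a generic regression-based QSAR model. The descriptor matrix, the synthetic activities, and the choice of a 3-component PLS model are assumptions made purely for illustration; they are not the CoMFA/CoMSIA fields or settings used in the study.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import LeaveOneOut

      rng = np.random.default_rng(0)
      X = rng.normal(size=(36, 10))                  # synthetic descriptor matrix, 36 inhibitors
      y = 1.5 * X[:, 0] - X[:, 1] + rng.normal(scale=0.3, size=36)  # synthetic activity values

      # Leave-one-out cross-validated q2 = 1 - PRESS / total sum of squares.
      press = 0.0
      for train, test in LeaveOneOut().split(X):
          pls = PLSRegression(n_components=3).fit(X[train], y[train])
          y_hat = pls.predict(X[test]).ravel()[0]
          press += (y[test][0] - y_hat) ** 2

      q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)
      print(f"LOO q2 = {q2:.3f}")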

  14. Mechanical Design of NESSI: New Mexico Tech Extrasolar Spectroscopic Survey Instrument

    NASA Technical Reports Server (NTRS)

    Santoro, Fernando G.; Olivares, Andres M.; Salcido, Christopher D.; Jimenez, Stephen R.; Jurgenson, Colby A.; Hrynevych, Michael A.; Creech-Eakman, Michelle J.; Boston, Penny J.; Schmidt, Luke M.; Bloemhard, Heather; Rodeheffer, Dan; Vaive, Genevieve; Vasisht, Gautam; Swain, Mark R.; Deroo, Pieter

    2011-01-01

    NESSI: the New Mexico Tech Extrasolar Spectroscopic Survey Instrument is a ground-based multi-object spectrograph that operates in the near-infrared. It will be installed on one of the Nasmyth ports of the Magdalena Ridge Observatory (MRO) 2.4-meter Telescope sited in the Magdalena Mountains, about 48 km west of Socorro, NM. NESSI operates stationary to the telescope fork so as not to produce differential flexure between internal opto-mechanical components during or between observations. An appropriate mechanical design allows the instrument alignment to be highly repeatable and stable for both short and long observation timescales, within a wide range of temperature variation. NESSI is optically composed of a field lens, a field de-rotator, re-imaging optics, an auto-guider and a Dewar spectrograph that operates at LN2 temperature. In this paper we report on NESSI's detailed mechanical and opto-mechanical design, and the planning for mechanical construction, assembly, integration and verification.

  15. Designing Input Fields for Non-Narrative Open-Ended Responses in Web Surveys

    PubMed Central

    Couper, Mick P.; Kennedy, Courtney; Conrad, Frederick G.; Tourangeau, Roger

    2012-01-01

    Web surveys often collect information such as frequencies, currency amounts, dates, or other items requiring short structured answers in an open-ended format, typically using text boxes for input. We report on several experiments exploring design features of such input fields. We find little effect of the size of the input field on whether frequency or dollar amount answers are well-formed or not. By contrast, the use of templates to guide formatting significantly improves the well-formedness of responses to questions eliciting currency amounts. For date questions (whether month/year or month/day/year), we find that separate input fields improve the quality of responses over single input fields, while drop boxes further reduce the proportion of ill-formed answers. Drop boxes also reduce completion time when the list of responses is short (e.g., months), but marginally increase completion time when the list is long (e.g., birth dates). These results suggest that non-narrative open questions can be designed to help guide respondents to provide answers in the desired format. PMID:23411468

  16. The Hawk-I UDS and GOODS Survey (HUGS): Survey design and deep K-band number counts

    NASA Astrophysics Data System (ADS)

    Fontana, A.; Dunlop, J. S.; Paris, D.; Targett, T. A.; Boutsia, K.; Castellano, M.; Galametz, A.; Grazian, A.; McLure, R.; Merlin, E.; Pentericci, L.; Wuyts, S.; Almaini, O.; Caputi, K.; Chary, R.-R.; Cirasuolo, M.; Conselice, C. J.; Cooray, A.; Daddi, E.; Dickinson, M.; Faber, S. M.; Fazio, G.; Ferguson, H. C.; Giallongo, E.; Giavalisco, M.; Grogin, N. A.; Hathi, N.; Koekemoer, A. M.; Koo, D. C.; Lucas, R. A.; Nonino, M.; Rix, H. W.; Renzini, A.; Rosario, D.; Santini, P.; Scarlata, C.; Sommariva, V.; Stark, D. P.; van der Wel, A.; Vanzella, E.; Wild, V.; Yan, H.; Zibetti, S.

    2014-10-01

    We present the results of a new, ultra-deep, near-infrared imaging survey executed with the Hawk-I imager at the ESO VLT, of which we make all the data (images and catalog) public. This survey, named HUGS (Hawk-I UDS and GOODS Survey), provides deep, high-quality imaging in the K and Y bands over the portions of the UKIDSS UDS and GOODS-South fields covered by the CANDELS HST WFC3/IR survey. In this paper we describe the survey strategy, the observational campaign, the data reduction process, and the data quality. We show that, thanks to exquisite image quality and extremely long exposure times, HUGS delivers the deepest K-band images ever collected over areas of cosmological interest, and in general ideally complements the CANDELS data set in terms of image quality and depth. In the GOODS-S field, the K-band observations cover the whole CANDELS area with a complex geometry made of 6 different, partly overlapping pointings, in order to best match the deep and wide areas of CANDELS imaging. In the deepest region (which includes most of the Hubble Ultra Deep Field) exposure times exceed 80 hours of integration, yielding a 1-σ magnitude limit per square arcsec of ≃28.0 AB mag. The seeing is exceptional and homogeneous across the various pointings, confined to the range 0.38-0.43 arcsec. In the UDS field the survey is about one magnitude shallower (to match the correspondingly shallower depth of the CANDELS images) but also includes Y-band imaging (which, in the UDS, was not provided by the CANDELS WFC3/IR imaging). In the K-band, with an average exposure time of 13 hours, and seeing in the range 0.37-0.43 arcsec, the 1-σ limit per square arcsec in the UDS imaging is ≃27.3 AB mag. In the Y-band, with an average exposure time of ≃8 h, and seeing in the range 0.45-0.5 arcsec, the imaging yields a 1-σ limit per square arcsec of ≃28.3 AB mag. We show that the HUGS observations are well matched to the depth of the CANDELS WFC3/IR data, since the majority of even the faintest galaxies detected in the CANDELS H-band images are also detected in HUGS. Finally we present the K-band galaxy number counts produced by combining the HUGS data from the two fields. We show that the slope of the number counts depends sensitively on the assumed distribution of galaxy sizes, with potential impact on the estimated extra-galactic background light. All the HUGS images and catalogues are made publicly available at the ASTRODEEP website (http://www.astrodeep.eu) as well as from the ESO archive. Full Table 3 is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/570/A11

  17. Web-based surveys for corporate information gathering: a bias-reducing design framework

    Microsoft Academic Search

    JAKE BURKEY; WILLIAM L. KUECHLER

    2003-01-01

    The cost effectiveness of Internet-based communications in the ever more fully networked business environment continues to drive the use of Web surveys for corporate information gathering. However, simply applying traditional survey techniques to the Web can result in significant shortcomings in the data so gathered. Recent research has been directed at these issues, within the context of Web surveys as

  18. Optimal design of a lagrangian observing system for hydrodynamic surveys in coastal areas

    NASA Astrophysics Data System (ADS)

    Cucco, Andrea; Quattrocchi, Giovanni; Antognarelli, Fabio; Satta, Andrea; Maicu, Francesco; Ferrarin, Christian; Umgiesser, Georg

    2014-05-01

    The optimization of ocean observing systems is a pressing need for scientific research. In particular, the improvement of short-term ocean observing networks is achievable by reducing the cost-benefit ratio of the field campaigns and by increasing the quality of measurements. Numerical modeling is a powerful tool for determining the appropriateness of a specific observing system and for optimizing the sampling design. This is particularly true when observations are carried out in coastal areas and lagoons, where the use of satellites is prohibitive due to the shallowness of the water. For such areas, numerical models are the most efficient tool both to provide a preliminary assessment of the local physical environment and to make short-term predictions about its change. In this context, a test case experiment was carried out within an enclosed shallow-water area, the Cabras Lagoon (Sardinia, Italy). The aim of the experiment was to explore the optimal design for a field survey based on the use of coastal Lagrangian buoys. A three-dimensional hydrodynamic model based on the finite element method (SHYFEM3D, Umgiesser et al., 2004) was implemented to simulate the lagoon water circulation. The model domain extends over the whole Cabras Lagoon and the whole Oristano Gulf, including the surrounding coastal area. Lateral open boundary conditions were provided by the operational ocean model system WMED, and only wind forcing, provided by the SKIRON atmospheric model (Kallos et al., 1997), was considered as the surface boundary condition. The model was applied to provide a number of ad hoc scenarios and to explore the efficiency of the short-term hydrodynamic survey. A first field campaign was carried out to investigate the Lagrangian circulation inside the lagoon under the main wind forcing condition (Mistral wind from the north-west). The trajectories followed by the Lagrangian buoys and the estimated Lagrangian velocities were used to calibrate the model parameters and to validate the simulation results. A set of calibration runs was performed and the model accuracy in reproducing the surface circulation was assessed. A numerical simulation was then conducted to predict the wind-induced lagoon water circulation and the paths followed by numerical particles inside the lagoon domain. The simulated particle paths were analyzed and the optimal configuration for the buoy deployment was designed in real time. The selected deployment geometry was then tested during a further field campaign. The obtained dataset revealed that the chosen measurement strategy provided a near-synoptic survey with the longest records for the considered specific observing experiment. This work aims to emphasize the mutual usefulness of observations and numerical simulations in coastal ocean applications, and it proposes an efficient approach to harmonizing different expertise toward the investigation of a given specific research issue. Cucco, A., Sinerchia, M., Ribotti, A., Olita, A., Fazioli, L., Perilli, A., Sorgente, B., Borghini, M., Schroeder, K., Sorgente, R., 2012. A high-resolution real-time forecasting system for predicting the fate of oil spills in the Strait of Bonifacio (western Mediterranean Sea). Marine Pollution Bulletin 64(6), 1186-1200. Kallos, G., Nickovic, S., Papadopoulos, A., Jovic, D., Kakaliagou, O., Misirlis, N., Boukas, L., Mimikou, N., G., S., J., P., Anadranistakis, E., and Manousakis, M., 1997.
The regional weather forecasting system Skiron: An overview, in: Proceedings of the Symposium on Regional Weather Prediction on Parallel Computer Environments, 109-122, Athens, Greece. Umgiesser, G., Melaku Canu, D., Cucco, A., Solidoro, C., 2004. A finite element model for the Venice Lagoon. Development, set up, calibration and validation. Journal of Marine Systems 51, 123-145.

  19. Design Effects and Generalized Variance Functions for the 1990-91 Schools and Staffing Survey (SASS). Volume I. User's Manual.

    ERIC Educational Resources Information Center

    Salvucci, Sameena; And Others

    This user's manual summarizes the results and use of design effects and generalized variance functions (GVF) to approximate standard errors for the 1990-91 Schools and Staffing Survey (SASS). It is Volume I of a two-volume publication that is part of the Technical Report series of the National Center for Education Statistics (NCES). The SASS is a…
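
    As a generic illustration of how a generalized variance function of the kind documented in the manual is fitted and used, the sketch below fits the standard model relvariance = a + b/x to hypothetical (estimate, standard error) pairs and then approximates the standard error of a new estimate. All numbers are invented; the actual SASS GVF parameters are given in the manual itself.

      import numpy as np

      # Hypothetical (estimate, standard error) pairs for a set of survey statistics.
      x  = np.array([5e3, 2e4, 8e4, 3e5, 1e6])         # estimated totals
      se = np.array([9e2, 2.1e3, 4.5e3, 9.0e3, 1.7e4])  # their direct standard errors
      relvar = (se / x) ** 2                            # relative variances

      # Fit the standard GVF model  relvar = a + b / x  by ordinary least squares.
      A = np.column_stack([np.ones_like(x), 1.0 / x])
      a, b = np.linalg.lstsq(A, relvar, rcond=None)[0]

      # Use the fitted GVF to approximate the standard error of a new estimated total.
      x_new = 1.5e5
      se_new = x_new * np.sqrt(a + b / x_new)
      print(f"approximate SE for an estimate of {x_new:.0f}: {se_new:.0f}")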

  20. Designing Anti-Influenza Aptamers: Novel Quantitative Structure Activity Relationship Approach Gives Insights into Aptamer – Virus Interaction

    PubMed Central

    Musafia, Boaz; Oren-Banaroya, Rony; Noiman, Silvia

    2014-01-01

    This study describes the development of aptamers as a therapy against influenza virus infection. Aptamers are oligonucleotides (like ssDNA or RNA) that are capable of binding to a variety of molecular targets with high affinity and specificity. We have studied the ssDNA aptamer BV02, which was designed to inhibit influenza infection by targeting the hemagglutinin viral protein, a protein that facilitates the first stage of the virus’ infection. While testing other aptamers and during lead optimization, we realized that the dominant characteristics that determine the aptamer’s binding to the influenza virus may not necessarily be sequence-specific, as with other known aptamers, but rather depend on general 2D structural motifs. We adopted a QSAR (quantitative structure-activity relationship) tool and developed a computational algorithm that correlates six calculated structural and physicochemical properties with the aptamers’ binding affinity to the virus. The QSAR study provided us with a predictive tool of the binding potential of an aptamer to the influenza virus. The correlation between the calculated and actual binding was R2 = 0.702 for the training set, and R2 = 0.66 for the independent test set. Moreover, in the test set the model’s sensitivity was 89%, and the specificity was 87%, in selecting aptamers with enhanced viral binding. The most important properties that positively correlated with the aptamer’s binding were the aptamer length, 2D-loops and repeating sequences of C nucleotides. Based on the structure-activity study, we have managed to produce aptamers having viral affinity that was more than 20 times higher than that of the original BV02 aptamer. Further testing of influenza infection in cell culture and animal models yielded aptamers with 10 to 15 times greater anti-viral activity than the BV02 aptamer. Our insights concerning the mechanism of action and the structural and physicochemical properties that govern the interaction with the influenza virus are discussed. PMID:24846127

  1. Final Design of the CARMENES M-Dwarf Radial-Velocity Survey Instrument

    NASA Astrophysics Data System (ADS)

    Quirrenbach, Andreas; Amado, P.; Seifert, W.; Sánchez Carrasco, M. A.; Ribas, I.; Reiners, A.; Mandel, H.; Caballero, J. A.; Mundt, R.; Galadí, D.; Consortium, CARMENES

    2013-01-01

    CARMENES (Calar Alto high-Resolution search for M dwarfs with Exo-earths with Near-infrared and optical Echelle Spectrographs) is a next-generation instrument being built for the 3.5m telescope at the Calar Alto Observatory by a consortium of eleven Spanish and German institutions. CARMENES will conduct a five-year exoplanet survey targeting ~300 M dwarfs. The CARMENES instrument consists of two separate échelle spectrographs covering the wavelength range from 0.55 to 1.7 µm at a spectral resolution of R = 82,000, fed by fibers from the Cassegrain focus of the telescope. For late-M spectral types, the wavelength range around 1.0 µm (Y band) is the most important wavelength region for radial velocity work. Therefore, the efficiency of CARMENES will be optimized in this range. Since CCDs do not provide high enough efficiency around 1.0 µm and no signal at all beyond the Si cutoff at 1.1 µm, a near-IR detector is required. It is thus natural to adopt an instrument concept with two spectrographs, one equipped with a CCD for the range 0.55-1.05 µm, and one with HgCdTe detectors for the range from 0.9-1.7 µm. Each spectrograph will be coupled to the 3.5m telescope with its own optical fiber. The front end will contain a dichroic beam splitter and an atmospheric dispersion corrector, to feed the light into the fibers leading to the spectrographs. Guiding is performed with a separate camera. Additional fibers are available for simultaneous injection of light from emission line lamps for RV calibration. The spectrographs are mounted on benches inside vacuum tanks located in the coudé laboratory of the 3.5m dome. Each vacuum tank is equipped with a temperature stabilization system capable of keeping the temperature constant to within ±0.01K over 24h. The visible-light spectrograph will be operated near room temperature, while the NIR spectrograph will be cooled to 140K. The CARMENES instrument passed its preliminary design review in July 2011; the final design is just being completed. Commissioning of the instrument is planned for the first half of 2014. At least 600 useable nights have been allocated at the Calar Alto 3.5m Telescope for the CARMENES survey in the time frame from 2014 to 2018.

  2. SIS Mixer Design for a Broadband Millimeter Spectrometer Suitable for Rapid Line Surveys and Redshift Determinations

    NASA Technical Reports Server (NTRS)

    Rice, F.; Sumner, M.; Zmuidzinas, J.; Hu, R.; LeDuc, H.; Harris, A.; Miller, D.

    2004-01-01

    We present some detail of the waveguide probe and SIS mixer chip designs for a low-noise 180-300 GHz double-sideband receiver with an instantaneous RF bandwidth of 24 GHz. The receiver's single SIS junction is excited by a broadband, fixed-tuned waveguide probe on a silicon substrate. The IF output is coupled to a 6-18 GHz MMIC low-noise preamplifier. Following further amplification, the output is processed by an array of 4 GHz, 128-channel analog autocorrelation spectrometers (WASP 11). The single-sideband receiver noise temperature goal of 70 Kelvin will provide a prototype instrument capable of rapid line surveys and of relatively efficient carbon monoxide (CO) emission line searches of distant, dusty galaxies. The latter application's goal is to determine redshifts by measuring the frequencies of CO line emissions from the star-forming regions dominating the submillimeter brightness of these galaxies. Construction of the receiver has begun; lab testing should begin in the fall. Demonstration of the receiver on the Caltech Submillimeter Observatory (CSO) telescope should begin in spring 2003.

  3. Improving the design of amphibian surveys using soil data: A case study in two wilderness areas

    USGS Publications Warehouse

    Bowen, K.D.; Beever, E.A.; Gafvert, U.B.

    2009-01-01

    Amphibian populations are known, or thought to be, declining worldwide. Although protected natural areas may act as reservoirs of biological integrity and serve as benchmarks for comparison with unprotected areas, they are not immune from population declines and extinctions and should be monitored. Unfortunately, identifying survey sites and performing long-term fieldwork within such (often remote) areas involves a special set of problems. We used the USDA Natural Resource Conservation Service Soil Survey Geographic (SSURGO) Database to identify, a priori, potential habitat for aquatic-breeding amphibians on North and South Manitou Islands, Sleeping Bear Dunes National Lakeshore, Michigan, and compared the results to those obtained using National Wetland Inventory (NWI) data. The SSURGO approach identified more target sites for surveys than the NWI approach, and it identified more small and ephemeral wetlands. Field surveys used a combination of daytime call surveys, night-time call surveys, and perimeter surveys. We found that sites that would not have been identified with NWI data often contained amphibians and, in one case, contained wetland-breeding species that would not have been found using NWI data. Our technique allows for easy a priori identification of numerous survey sites that might not be identified using other sources of spatial information. We recognize, however, that the most effective site identification and survey techniques will likely use a combination of methods in addition to those described here.

  4. Designing a Household Survey to Address Seasonality in Child Care Arrangements

    ERIC Educational Resources Information Center

    Schmidt, Stefanie R.; Wang, Kevin H.; Sonenstein, Freya L.

    2008-01-01

    In household telephone surveys, a long field period may be required to maximize the response rate and achieve adequate sample sizes. However, long field periods can be problematic when measures of seasonally affected behavior are sought. Surveys of child care use are one example because child care arrangements vary by season. Options include…

  5. Cluster Surveys

    E-print Network

    Isabella M. GIOIA

    2000-10-03

    I review some of the current efforts to create unbiased samples of galaxy clusters. Readers are referred elsewhere for general wide area sky surveys and redshift surveys, and for Sunyaev-Zeldovich, radio, infrared and submm surveys, some of which were not designed to search primarily for clusters. My focus will be on optical and X-ray samples and on high redshift clusters.

  6. The Sample Design of the 2002 Cyprus Survey of Consumer Finances

    E-print Network

    Karagrigoriou, Alex

    …Dr. M. Haliassos and Dr. C. Hassapis of the University of Cyprus and Dr. G. Kyriacou and Dr. G… and the Italian Survey of Household Income and Wealth. Guiso, Haliassos, and Jappelli have compared household…

  7. Nutrition and Health Survey of Taiwan Elementary School Children 2001-2002: research design, methods and scope.

    PubMed

    Tu, Su-Hao; Hung, Yung-Tai; Chang, Hsing-Yi; Hang, Chi-Ming; Shaw, Ning-Sing; Lin, Wei; Lin, Yi-Chin; Hu, Su-Wan; Yang, Yao-Hsu; Wu, Tzee-Chung; Chang, Ya-Hui; Su, Shu-Chen; Hsu, Hsiao-Chi; Tsai, Keh-Sung; Chen, Ssu-Yuan; Yeh, Chih-Jung; Pan, Wen-Harn

    2007-01-01

    The "Nutrition and Health Survey of Taiwan's Elementary School Children (2001-2002)" was designed to investigate the nutritional status, influential dietary and non-dietary factors, health and development, and school performance of schoolchildren, as well as the inter-relationships among these factors. The survey adopted a two-stage stratified, clustered probability sampling scheme. Towns and districts in Taiwan with particular ethnic and geographical characteristics were assigned to 13 strata, including Hakka areas, mountain areas, eastern Taiwan, the Penghu Islands, 3 northern regions, 3 central regions and 3 southern regions. Eight schools were selected from each stratum using the probability proportional to size method. Twenty-four pupils were randomly selected within each school. The survey included face-to-face interviews and health examinations. Taking seasonal effects into consideration, the face-to-face interviews were evenly allocated across the two semesters. A total of 2,419 face-to-face interviews and 2,475 health examinations were completed. Interview data included household information, socio-demographics, 24-hour dietary recall, food frequency, dietary and nutritional knowledge, attitudes and behaviors, physical activity, medical history, oral health, pubertal development, and bone health. Health exam data included anthropometry, blood pressure, physical fitness, bone density, dental health, and blood and urine collection. SUDAAN was used to adjust for the sampling design effect. There were no significant differences in sibling rank and parental characteristics between respondents and non-respondents, which indicates that the survey is representative and unbiased. The results of this survey will increase our understanding of the nutrition and health status of schoolchildren and can be used to shape public health policy in Taiwan. PMID:17723991
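
    The school-selection step described above relies on probability-proportional-to-size (PPS) sampling. A minimal sketch of systematic PPS selection is given below; the enrolment figures and stratum composition are hypothetical, and a production design would also handle units larger than the sampling interval as certainty selections.

      import numpy as np

      def pps_systematic(sizes, n, seed=0):
          """Systematic probability-proportional-to-size (PPS) selection of n units."""
          rng = np.random.default_rng(seed)
          sizes = np.asarray(sizes, dtype=float)
          cum = np.cumsum(sizes)
          step = cum[-1] / n                 # sampling interval, in size units
          start = rng.uniform(0.0, step)     # random start within the first interval
          points = start + step * np.arange(n)
          # A unit larger than the interval can be hit more than once; real designs
          # treat such units as certainty selections and re-run PPS on the rest.
          return np.searchsorted(cum, points)

      # Hypothetical stratum: enrolment sizes of twelve candidate schools.
      enrolments = [820, 450, 1200, 300, 660, 980, 150, 500, 720, 410, 880, 260]
      selected = pps_systematic(enrolments, n=8)
      print("selected school indices:", selected)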

  8. A Meta-analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    NASA Astrophysics Data System (ADS)

    Zhang, Lin

    2013-07-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tools. This paper discusses the emerging issues, such as how learning effectiveness can be understood in relation to different technology features, and how pieces of qualitative and quantitative results can be integrated to achieve a broader understanding of technology designs. To address these issues, this paper proposes a meta-analysis method. Detailed explanations about the structure of the methodology and its scientific mechanism are provided for discussion and suggestions. This paper ends with an in-depth discussion of the concerns and questions that educational researchers might raise, such as how this methodology takes care of learning contexts.

  9. Survey of waste package designs for disposal of high-level waste/spent fuel in selected foreign countries

    SciTech Connect

    Schneider, K.J.; Lakey, L.T.; Silviera, D.J.

    1989-09-01

    This report presents the results of a survey of the waste package strategies for seven western countries with active nuclear power programs that are pursuing disposal of spent nuclear fuel or high-level wastes in deep geologic rock formations. Information, current as of January 1989, is given on the leading waste package concepts for Belgium, Canada, France, Federal Republic of Germany, Sweden, Switzerland, and the United Kingdom. All but two of the countries surveyed (France and the UK) have developed design concepts for their repositories, but none of the countries has developed its final waste repository or package concept. Waste package concepts are under study in all the countries surveyed, except the UK. Most of the countries have not yet developed a reference concept and are considering several concepts. Most of the information presented in this report is for the current reference or leading concepts. All canisters for the wastes are cylindrical, and are made of metal (stainless steel, mild steel, titanium, or copper). The canister concepts have relatively thin walls, except those for spent fuel in Sweden and Germany. Diagrams are presented for the reference or leading concepts for canisters for the countries surveyed. The expected lifetimes of the conceptual canisters in their respective disposal environment are typically 500 to 1,000 years, with Sweden's copper canister expected to last as long as one million years. Overpack containers that would contain the canisters are being considered in some of the countries. All of the countries surveyed, except one (Germany), are currently planning to utilize a buffer material (typically bentonite) surrounding the disposal package in the repository. Most of the countries surveyed plan to limit the maximum temperature in the buffer material to about 100°C. 52 refs., 9 figs.

  10. Power of Daughter and Granddaughter Designs for Determining Linkage Between Marker Loci and Quantitative Trait Loci in Dairy Cattle

    Microsoft Academic Search

    J. I. Weller; Y. Kashi; M. Soller

    1990-01-01

    ABSTRACT There is considerable interest in bo- vine DNA-level polymorphic marker loci as a means of mapping quantitative trait loci (QTL) of economic,importance in cattle. Progeny of a sire heterozygous for both a marker locus and a linked QTL, which,inherit different alleles for the marker, will have different trait means. Based on this, power to detect QTL, as , grandsire

  11. A survey of primary school teachers’ conceptions of force and motion

    Microsoft Academic Search

    Colin Kruger; Mike Summers; David Palacio

    1990-01-01

    This article reports the results of a survey designed to provide some quantitative evidence concerning the extent to which misconceptions about force and motion exist among primary school teachers. To the best of our knowledge, this is the first investigation of its kind carried out in the UK. The survey questionnaire contained statements based on those made by primary teachers

  12. Statistics of Local Public School Systems, Fall 1970: Staff. Elementary-Secondary General Information Survey Series.

    ERIC Educational Resources Information Center

    Hughes, Warren A.

    This publication is the fourth report in an annual survey series designed to provide reliable data on individual local public school systems for planning, policy, and research purposes. The report contains tables of national estimates and basic data tables providing quantitative staff data on the school systems in the survey. The data are derived…

  13. The FMOS-COSMOS survey of star-forming galaxies at z~1.6 III. Survey design, performance, and sample characteristics

    E-print Network

    Silverman, J D; Arimoto, N; Renzini, A; Rodighiero, G; Daddi, E; Sanders, D; Kartaltepe, J; Zahid, J; Nagao, T; Kewley, L J; Lilly, S J; Sugiyama, N; Capak, P; Carollo, C M; Chu, J; Hasinger, G; Ilbert, O; Kajisawa, M; Koekemoer, A M; Kovac, K; Fevre, O Le; Masters, D; McCracken, H J; Onodera, M; Scoville, N; Strazzullo, V; Taniguchi, Y

    2014-01-01

    We present a spectroscopic survey of galaxies in the COSMOS field using the Fiber Multi-Object Spectrograph (FMOS), a near-infrared instrument on the Subaru Telescope. Our survey is specifically designed to detect the Halpha emission line that falls within the H-band (1.6-1.8 micron) spectroscopic window from star-forming galaxies with M_stellar>10^10 Msolar and 1.4 < z < 1.7. With the high multiplex capabilities of FMOS, it is now feasible to construct samples of over one thousand galaxies having spectroscopic redshifts at epochs that were previously challenging. The high-resolution mode (R~2600) is implemented to effectively separate Halpha and [NII] emission lines thus enabling studies of gas-phase metallicity and photoionization conditions of the interstellar medium. The broad goals of our program are concerned with how star formation depends on stellar mass and environment, both recognized as drivers of galaxy evolution at lower redshifts. In addition to the main galaxy sample, our target selection...

  14. A Survey of Design Methods for Failure Detection in Dynamic Systems. Automatica, Vol. 12, pp. 601-611. Pergamon Press, 1976.

    E-print Network

    Willsky, Alan S.

    …sophisticated digital system design techniques that can greatly improve overall system performance. … (Taylor [46], and Meyer and Cicolani [47]), where a great deal of effort is being put into the design…

  15. Site study plan for EDBH (Engineering Design Boreholes) seismic surveys, Deaf Smith County site, Texas: Revision 1

    SciTech Connect

    Hume, H.

    1987-12-01

    This site study plan describes seismic reflection surveys to run north-south and east-west across the Deaf Smith County site, and intersecting near the Engineering Design Boreholes (EDBH). Both conventional and shallow high-resolution surveys will be run. The field program has been designed to acquire subsurface geologic and stratigraphic data to address information/data needs resulting from Federal and State regulations and Repository program requirements. The data acquired by the conventional surveys will be common-depth-point seismic reflection data optimized for reflection events that indicate geologic structure near the repository horizon. The data will also resolve the basement structure and shallow reflection events up to about the top of the evaporite sequence. Field acquisition includes a testing phase to check/select parameters and a production phase. The field data will be subjected immediately to conventional data processing and interpretation to determine if there are any anomalous structural or stratigraphic conditions that could affect the choice of the EDBH sites. After the EDBH's have been drilled and logged, including vertical seismic profiling, the data will be reprocessed and reinterpreted for detailed structural and stratigraphic information to guide shaft development. The shallow high-resolution seismic reflection lines will be run along the same alignments, but the lines will be shorter and limited to the immediate vicinity of the EDBH sites. These lines are planned to detect faults or thick channel sands that may be present at the EDBH sites. 23 refs., 7 figs., 5 tabs.

  16. Using SEM to Analyze Complex Survey Data: A Comparison between Design-Based Single-Level and Model-Based Multilevel Approaches

    ERIC Educational Resources Information Center

    Wu, Jiun-Yu; Kwok, Oi-man

    2012-01-01

    Both ad-hoc robust sandwich standard error estimators (design-based approach) and multilevel analysis (model-based approach) are commonly used for analyzing complex survey data with nonindependent observations. Although these 2 approaches perform equally well on analyzing complex survey data with equal between- and within-level model structures…

  17. Quantitative molecular methods in virology

    Microsoft Academic Search

    M. Clementi; S. Menzo; A. Manzin; P. Bagnarelli

    1995-01-01

    During the past few years, significant technical effort was made to develop molecular methods for the absolute quantitation of nucleic acids in biological samples. In virology, semi-quantitative and quantitative techniques of different principle, complexity, and reliability were designed, optimized, and applied in basic and clinical research. The principal data obtained in successful pilot applications in vivo are reported in…

  18. Predictors of intentions to quit smoking in Aboriginal tobacco smokers of reproductive age in regional New South Wales (NSW), Australia: quantitative and qualitative findings of a cross-sectional survey

    PubMed Central

    Gould, Gillian Sandra; Watt, Kerrianne; McEwen, Andy; Cadet-James, Yvonne; Clough, Alan R

    2015-01-01

    Objectives To assess the predictors of intentions to quit smoking in a community sample of Aboriginal smokers of reproductive age, in whom smoking prevalence is slow to decline. Design, setting and participants A cross-sectional survey involved 121 Aboriginal smokers, aged 18–45 years from January to May 2014, interviewed at community events on the Mid-North Coast NSW. Qualitative and quantitative data were collected on smoking and quitting attitudes, behaviours and home smoking rules. Perceived efficacy for quitting, and perceived threat from smoking, were uniquely assessed with a validated Risk Behaviour Diagnosis (RBD) Scale. Main outcome measures Logistic regression explored the impact of perceived efficacy, perceived threat and consulting previously with a doctor or health professional (HP) on self-reported intentions to quit smoking, controlling for potential confounders, that is, protection responses and fear control responses, home smoking rules, gender and age. Participants’ comments regarding smoking and quitting were investigated via inductive analysis, with the assistance of Aboriginal researchers. Results Two-thirds of smokers intended to quit within 3 months. Perceived efficacy (OR=4.8; 95% CI 1.78 to 12.93) and consulting previously with a doctor/HP about quitting (OR=3.82; 95% CI 1.43 to 10.2) were significant predictors of intentions to quit. ‘Smoking is not doing harm right now’ was inversely associated with quit intentions (OR=0.25; 95% CI 0.08 to 0.8). Among those who reported making a quit attempt, after consulting with a doctor/HP, 40% (22/60) rated the professional support received as low (0–2/10). Qualitative themes were: the negatives of smoking (ie, disgust, regret, dependence and stigma), health effects and awareness, quitting, denial, ‘smoking helps me cope’ and social aspects of smoking. Conclusions Perceived efficacy and consulting with a doctor/HP about quitting may be important predictors of intentions to quit smoking in Aboriginal smokers of reproductive age. Professional support was generally perceived to be low; thus, it could be improved for these Aboriginal smokers. Aboriginal participants expressed strong sentiments about smoking and quitting. PMID:25770232
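
    The kind of analysis reported above (odds ratios with 95% confidence intervals from a logistic regression of quit intentions on baseline predictors) can be sketched generically as follows. The data are simulated and the variable names are assumptions; this is not the study's analysis code, and it omits any survey-specific adjustments.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # Simulated analysis file: a binary outcome (intends to quit within 3 months)
      # and baseline predictors loosely modelled on those in the abstract.
      rng = np.random.default_rng(1)
      n = 121
      df = pd.DataFrame({
          "perceived_efficacy": rng.integers(0, 2, n),
          "consulted_hp":       rng.integers(0, 2, n),
          "age":                rng.integers(18, 46, n),
      })
      lin = -0.5 + 1.2 * df["perceived_efficacy"] + 0.9 * df["consulted_hp"]
      df["intends_quit"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))

      X = sm.add_constant(df[["perceived_efficacy", "consulted_hp", "age"]])
      fit = sm.Logit(df["intends_quit"], X).fit(disp=False)

      # Odds ratios with 95% confidence intervals, as reported in the abstract.
      ci = fit.conf_int()
      or_table = pd.DataFrame({"OR": np.exp(fit.params),
                               "2.5%": np.exp(ci[0]),
                               "97.5%": np.exp(ci[1])})
      print(or_table.round(2))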

  19. Iteratively constructive sequential design of experiments and surveys with nonlinear parameter-data relationships

    Microsoft Academic Search

    T. Guest; A. Curtis

    2009-01-01

    In experimental design, the main aim is to minimize postexperimental uncertainty on parameters by maximizing relevant information collected in a data set. Using an entropy-based method constructed on a Bayesian framework, it is possible to design experiments for highly nonlinear problems. However, the method is computationally infeasible for design spaces with even a few dimensions. We introduce an iteratively constructive

  20. Preliminary Findings from a Quantitative Study: Comparative Analysis of Students' Learning Outcomes During Co-ops, Research, and Capstone Design

    Microsoft Academic Search

    Olga Pierrakos; Maura Borrego

    Undergraduate research, co-ops, and capstone design are three culminating problem-based learning (PBL) experiences that are highly promoted in engineering education. Even though the benefits of such experiences are known, the core problem driving the proposed research is the limited empirical evidence that exists on students' learning outcomes as a result of participating in such experiences. The goal was to design

  1. Image resolution analysis: a new, robust approach to seismic survey design

    E-print Network

    Tzimeas, Constantinos

    2005-08-29

    …configuration, parameters such as the structure and seismic velocity also influence image resolution. Understanding their effect on image quality allows us to better interpret the resolution results for the surveys under examination. A salt model was used to simulate...

  2. NATIONAL SURVEY OF ADOLESCENT WELL-BEING (NSCAW): A COMPARISON OF MODEL AND DESIGN BASED ANALYSES

    E-print Network

    …with the child welfare system. This paper uses the NSCAW data to investigate the role of maternal depression. An investigation of the survey methodology and the actual data led to some manipulation of the data and assumptions…

  3. A Feasibility Study of Longitudinal Design for Schools and Staffing Survey. Working Paper Series.

    ERIC Educational Resources Information Center

    Baker, David P.; Levine, Roger; Han, Mei; Garet, Michael

    This report summarizes tasks undertaken to assess the feasibility of a longitudinal analysis of elementary and secondary schools. The initial objective was to see if it was technically possible to analyze overlap samples of schools from the Schools and Staffing Survey (SASS) to produce substantive findings. Over the course of the project various…

  4. Designing future dark energy space missions. II. Photometric redshift of space weak lensing optimized surveys

    Microsoft Academic Search

    S. Jouvel; J.-P. Kneib; G. Bernstein; O. Ilbert; P. Jelinsky; B. Milliard; A. Ealet; C. Schimd; T. Dahlen; S. Arnouts

    2011-01-01

    Context. With the discovery of the accelerated expansion of the universe, different observational probes have been proposed to investigate the presence of dark energy, including possible modifications to the gravitation laws by accurately measuring the expansion of the Universe and the growth of structures. We need to optimize the return from future dark energy surveys to obtain the best results

  5. Designing and Mining Multi-Terabyte Astronomy Archives: The Sloan Digital Sky Survey

    Microsoft Academic Search

    Alexander S. Szalay; Peter Z. Kunszt; Ani Thakar; Jim Gray; Donald R. Slutz; Robert J. Brunner

    2000-01-01

    The next-generation astronomy digital archives will cover most of the sky at fine resolution in many wavelengths, from X-rays, through ultraviolet, optical, and infrared. The archives will be stored at diverse geographical locations. One of the first of these projects, the Sloan Digital Sky Survey (SDSS), is creating a 5-wavelength catalog over 10,000 square degrees of…

  6. Survey of light-water-reactor designs to be offered in the United States

    Microsoft Academic Search

    Spiewak

    1986-01-01

    ORNL has conducted a Nuclear Power Options Viability Study for the Department of Energy. That study is primarily concerned with new technology which could be developed for initial operation in the 2000 to 2010 time frame. Such technology would have to compete not only with coal options but with incrementally improved commercial light-water reactors. The survey reported here was undertaken to…

  7. Schools and Staffing Survey: Sample Design and Estimation. 1993-94. Technical/Methodology Report.

    ERIC Educational Resources Information Center

    Abramson, Robert; And Others

    The Schools and Staffing Survey (SASS) of the National Center for Education Statistics provides periodic, timely data on public and private elementary and secondary schools in the United States. Data collected include: school and teacher characteristics; school operations, programs, and policies; teacher demand and supply; and the opinions of…

  8. A design of strategic alliance based on value chain of surveying and mapping enterprises in China

    NASA Astrophysics Data System (ADS)

    Duan, Hong; Huang, Xianfeng

    2007-06-01

    In this paper, we use value chain and strategic alliance theories to analyze the surveying and mapping industry and its enterprises. The value chain of surveying and mapping enterprises is highly connected but split by administrative interference, and the enterprises are commonly small in scale. Given this, we consider that establishing a non-equity-holding strategic alliance based on the value chain is a viable approach: it not only lets the enterprises share superior resources across the different sectors of the whole value chain, but also avoids conflicting with the interests of the related administrative departments; in this way, surveying and mapping enterprises can develop both individually and collectively. We then give a method for building up the strategic alliance model by partitioning the value chain and using the advantages of companies in different value chain sectors. Finally, we use game theory to analyze the internal rules of the strategic alliance and show that it is a suitable way to realize the development of surveying and mapping enterprises.

  9. The Results of the National Heritage Language Survey: Implications for Teaching, Curriculum Design, and Professional Development

    ERIC Educational Resources Information Center

    Carreira, Maria; Kagan, Olga

    2011-01-01

    This article reports on a survey of heritage language learners (HLLs) across different heritage languages (HLs) and geographic regions in the United States. A general profile of HLLs emerges as a student who (1) acquired English in early childhood, after acquiring the HL; (2) has limited exposure to the HL outside the home; (3) has relatively…

  10. The TRacking Adolescents' Individual Lives Survey (TRAILS): Design, Current Status, and Selected Findings

    ERIC Educational Resources Information Center

    Ormel, Johan; Oldehinkel, Albertine J.; Sijtsema, Jelle; van Oort, Floor; Raven, Dennis; Veenstra, Rene; Vollebergh, Wilma A. M.; Verhulst, Frank C.

    2012-01-01

    Objectives: The objectives of this study were as follows: to present a concise overview of the sample, outcomes, determinants, non-response and attrition of the ongoing TRacking Adolescents' Individual Lives Survey (TRAILS), which started in 2001; to summarize a selection of recent findings on continuity, discontinuity, risk, and protective…

  11. Designing Messaging to Engage Patients in an Online Suicide Prevention Intervention: Survey Results From Patients With Current Suicidal Ideation

    PubMed Central

    Lungu, Anita; Richards, Julie; Simon, Gregory E; Clingan, Sarah; Siler, Jaeden; Snyder, Lorilei; Ludman, Evette

    2014-01-01

    Background Computerized, Internet-delivered interventions can be efficacious; however, uptake and maintaining sustained client engagement are still big challenges. We see the development of effective engagement strategies as the next frontier in online health interventions, an area where much creative research has begun. We also argue that for engagement strategies to accomplish their purpose with novel targeted populations, they need to be tailored to such populations (ie, content is designed with the target population in mind). User-centered design frameworks provide a theoretical foundation for increasing user engagement and uptake by including users in development. However, deciding how to implement this approach to engage users in mental health intervention development is challenging. Objective The aim of this study was to get user input and feedback on acceptability of messaging content intended to engage suicidal individuals. Methods In March 2013, clinic intake staff distributed flyers announcing the study, “Your Feedback Counts,” to potential participants (individuals waiting to be seen for a mental health appointment) together with the Patient Health Questionnaire. The flyer explained that a score of two or three (“more than half the days” or “nearly every day” respectively) on the suicide ideation question made them eligible to provide feedback on components of a suicide prevention intervention under development. The patient could access an anonymous online survey by following a link. After providing consent online, participants completed the anonymous survey. Results Thirty-four individuals provided data on past demographic information. Participants reported that they would be most drawn to an intervention where they knew that they were cared about, that was personalized, that others like them had found it helpful, and that included examples with real people. Participants preferred email invitations with subject lines expressing concern and availability of extra resources. Participants also provided feedback about a media prototype including a brand design and advertisement video for introducing the intervention. Conclusions This paper provides one model (including development of an engagement survey, audience for an engagement survey, methods for presenting results of an engagement survey) for including target users in the development of uptake strategies for online mental health interventions. PMID:24509475

  12. Household market for electric vehicles. Testing the hybrid household hypothesis: A reflexively designed survey of new-car-buying, multi-vehicle California households. Final report

    SciTech Connect

    Turrentine, T.; Kurani, K.

    1995-05-12

    This survey was sponsored by the Air Resources Board in order to assist the Board in its evaluation of the zero-emission vehicle (ZEV) regulation. The survey was designed to test the consumer marketability of electric vehicles (EVs). A total of 454 households in six California metropolitan areas were surveyed. The survey consisted of four main parts: (1) Assessment of household demographics, the number and type of household vehicles, and environmental attitudes; (2) Travel diaries to determine household travel patterns; (3) Informational video and news articles on EVs, and (4) Vehicle purchase offerings and choices.

  13. Longitudinal emittance: An introduction to the concept and survey of measurement techniques including design of a wall current monitor

    SciTech Connect

    Webber, R.C.

    1990-03-01

    The properties of charged particle beams associated with the distribution of the particles in energy and in time can be grouped together under the category of longitudinal emittance. This article is intended to provide an intuitive introduction to the concept of longitudinal emittance; to provide an incomplete survey of methods used to measure this emittance and the related properties of bunch length and momentum spread; and to describe the detailed design of a 6 GHz bandwidth resistive wall current monitor useful for measuring bunch shapes of moderate to high intensity beams. Overall, the article is intended to be broad in scope, in most cases deferring details to cited original papers. 37 refs., 21 figs.

  14. An integrated device for magnetically-driven drug release and in situ quantitative measurements: Design, fabrication and testing

    NASA Astrophysics Data System (ADS)

    Bruvera, I. J.; Hernández, R.; Mijangos, C.; Goya, G. F.

    2015-03-01

    We have developed a device capable of remote triggering and in situ quantification of therapeutic drugs, based on magnetically responsive hydrogels of poly(N-isopropylacrylamide) (PNiPAAm) and alginate. The heating efficiency of these hydrogels, measured by their specific power absorption (SPA), showed that values between 100 and 300 W/g of material were high enough to reach the lower critical solution temperature (LCST) of the polymeric matrix within a few minutes. Drug release through the application of AC magnetic fields could be controlled by time-modulated field pulses in order to deliver the desired amount of drug. Using vitamin B12 as a concept drug, the device was calibrated to measure amounts of released drug as small as 25(2)×10^-9 g, demonstrating the potential of this device for very precise quantitative control of drug release.
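
    For context on the SPA figures quoted above, a common initial-slope estimate of specific power absorption from a calorimetric heating curve is SPA = (C_p · m_sample / m_magnetic) · (dT/dt). The short calculation below applies this formula to hypothetical values chosen only to land in the quoted 100-300 W/g range; none of the numbers are taken from the paper.

      # Initial-slope estimate of SPA from a calorimetric heating curve:
      #   SPA = (C_p * m_sample / m_magnetic) * (dT/dt at t -> 0)
      # All values below are hypothetical and chosen only for illustration.
      c_p      = 4.18    # J/(g K), specific heat of the mostly aqueous hydrogel
      m_sample = 0.50    # g, total sample mass
      m_mag    = 0.005   # g, mass of magnetic material in the sample
      dT_dt    = 0.30    # K/s, initial slope of the heating curve under the AC field

      spa = c_p * m_sample / m_mag * dT_dt
      print(f"SPA ~ {spa:.0f} W per gram of magnetic material")   # ~125 W/g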

  15. Ethical considerations in the design and execution of the National and Hispanic Health and Nutrition Examination Survey (HANES).

    PubMed Central

    Wagener, D K

    1995-01-01

    The purpose of this article is to describe some ethical considerations that have arisen during the design and implementation of the health examination surveys conducted by the National Center for Health Statistics of the Centers for Disease Control and Prevention. Three major areas of concern are discussed: sharing information from the study, banking and using banked tissue samples, and obligations for future testing of subjects. Specific concerns of sharing information include: when to inform, whom to inform, maintaining confidentiality, and how to inform individuals. Specific concerns of determining when sera will be banked and using banked samples include: depletion of samples for quality control, obtaining informed consent for unanticipated uses, access by others, and requests for batches of samples. Finally, specific concerns regarding future testing of subjects include: retesting for verification, retesting for interpretation, testing for different risk factors, and follow-up. Although existing surveys can provide experience or even suggest guidelines, the uniqueness of any new survey will generate unique ethical problems, requiring the careful formulation of unique solutions. PMID:7635116

  16. Quantitative analysis

    PubMed Central

    Nevin, John A.

    1984-01-01

    Quantitative analysis permits the isolation of invariant relations in the study of behavior. The parameters of these relations can serve as higher-order dependent variables in more extensive analyses. These points are illustrated by reference to quantitative descriptions of performance maintained by concurrent schedules, multiple schedules, and signal-detection procedures. Such quantitative descriptions of empirical data may be derived from mathematical theories, which in turn can lead to novel empirical analyses so long as their terms refer to behavioral and environmental events. Thus, quantitative analysis is an integral aspect of the experimental analysis of behavior. PMID:16812400

  17. A survey of the state-of-the-art of design automation an invited presentation

    Microsoft Academic Search

    Melvin A. Breuer

    1982-01-01

    This paper is a brief overview of an invited talk presented at the 9th Annual Conference on Design Automation. The work presented is based upon an extensive study of the status of industrial and government design automation systems applied to digital systems, with primary emphasis on digital cards and LSI circuits. A detailed summary of our findings can be found…

  18. On the design and control of mechatronic systems-a survey

    Microsoft Academic Search

    Rolf Isermann

    1996-01-01

    The integration of mechanical systems and microelectronics opens new possibilities for mechanical design and automatic functions. After a discussion of the mechanical and electronic design, the organization of information processing at different levels is described. Within this frame, “low-degree-intelligent” mechatronic systems can be developed which comprise adaptive control, supervision with fault diagnosis, and decisions with regard to further actions. This…

  19. Survey and analysis of research on supersonic drag-due-to-lift minimization with recommendations for wing design

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Mann, Michael J.

    1992-01-01

    A survey of research on drag-due-to-lift minimization at supersonic speeds, including a study of the effectiveness of current design and analysis methods was conducted. The results show that a linearized theory analysis with estimated attainable thrust and vortex force effects can predict with reasonable accuracy the lifting efficiency of flat wings. Significantly better wing performance can be achieved through the use of twist and camber. Although linearized theory methods tend to overestimate the amount of twist and camber required for a given application and provide an overly optimistic performance prediction, these deficiencies can be overcome by implementation of recently developed empirical corrections. Numerous examples of the correlation of experiment and theory are presented to demonstrate the applicability and limitations of linearized theory methods with and without empirical corrections. The use of an Euler code for the estimation of aerodynamic characteristics of a twisted and cambered wing and its application to design by iteration are discussed.

  20. Aerodynamic aircraft design methods and their notable applications: Survey of the activity in Japan

    NASA Technical Reports Server (NTRS)

    Fujii, Kozo; Takanashi, Susumu

    1991-01-01

    An overview of aerodynamic aircraft design methods and their recent applications in Japan is presented. A design code that was developed at the National Aerospace Laboratory (NAL) and is now in use is discussed; hence, most of the examples are the result of collaborative work between heavy industry and the National Aerospace Laboratory. A wide variety of applications in transonic to supersonic flow regimes is presented. Although the design of aircraft elements for external flows is the main focus, some internal flow applications are also presented. Recent applications of the design code, using the Navier-Stokes and Euler equations in the analysis mode, include the design of HOPE (a space vehicle) and Upper Surface Blowing (USB) aircraft configurations.

  1. A Survey of Applications and Research in Integrated Design Systems Technology

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The initial part of the study was begun with a combination of literature searches, World Wide Web searches, and contacts with individuals and companies who were known to members of our team to have an interest in topics that seemed to be related to our study. There is a long list of such topics, such as concurrent engineering, design for manufacture, life-cycle engineering, systems engineering, systems integration, systems design, design systems, integrated product and process approaches, enterprise integration, integrated product realization, and similar terms. These all capture, at least in part, the flavor of what we describe here as integrated design systems. An inhibiting factor in this inquiry was the absence of agreed terminology for the study of integrated design systems. It is common for the term to be applied to what are essentially augmented Computer-Aided Design (CAD) systems, which are integrated only to the extent that agreements have been reached to attach proprietary extensions to proprietary CAD programs. It is also common for some to use the term integrated design systems to mean a system that applies only, or mainly, to the design phase of a product life cycle. It is likewise common for many of the terms listed earlier to be used as synonyms for integrated design systems. We tried to avoid this ambiguity by adopting the definition of integrated design systems that is implied in the introductory notes that we provided to our contacts, cited earlier. We thus arrived at this definition: Integrated Design Systems refers to the integration of the different tools and processes that comprise the engineering of complex systems. It takes a broad view of the engineering of systems, to include consideration of the entire product realization process and the product life cycle. An important aspect of integrated design systems is the extent to which they integrate existing "islands of automation" into a comprehensive design and product realization environment. As the study progressed, we relied increasingly upon a networking approach to lead us to new information. The departure point for such searches often was a government-sponsored project or a company initiative. The advantage of this approach was that short conversations with knowledgeable persons would usually cut through confusion over differences of terminology, thereby somewhat reducing the search space of the study. Even so, it was not until late in our eight-month inquiry that we began to see signs of convergence of the search, in the sense that a number of the latest inquiries began to turn up references to earlier contacts. As suggested above, this convergence often occurred with respect to particular government or company projects.

  2. Design and sample characteristics of the 2005-2008 Nutrition and Health Survey in Taiwan.

    PubMed

    Tu, Su-Hao; Chen, Cheng; Hsieh, Yao-Te; Chang, Hsing-Yi; Yeh, Chih-Jung; Lin, Yi-Chin; Pan, Wen-Harn

    2011-01-01

    The Nutrition and Health Survey in Taiwan (NAHSIT) 2005-2008 was funded by the Department of Health to provide continued assessment of health and nutrition of the people in Taiwan. This household survey collected data from children aged less than 6 years and adults aged 19 years and above, and adopted a three-stage stratified, clustered sampling scheme similar to that used in the NAHSIT 1993-1996. Four samples were produced. One sample with five geographical strata was selected for inference to the whole of Taiwan, while the other three samples, including Hakka, Penghu and mountainous areas were produced for inference to each cultural stratum. A total of 6,189 household interviews and 3,670 health examinations were completed. Interview data included household information, socio-demographics, 24-hour dietary recall, food frequency and habits, dietary and nutritional knowledge, attitudes and behaviors, physical activity, medical history and bone health. Health exam data included anthropometry, blood pressure, physical fitness, bone density, as well as blood and urine collection. Response rate for the household interview was 65%. Of these household interviews, 59% participated in the health exam. Only in a few age subgroups were there significant differences in sex, age, education, or ethnicity distribution between respondents and non-respondents. For the health exam, certain significant differences between participants and non-participants were mostly observed in those aged 19-64 years. The results of this survey will be of benefit to researchers, policy makers and the public to understand and improve the nutrition and health status of pre-school children and adults in Taiwan. PMID:21669592

  3. Experimental design approach for the optimisation of a HPLC-fluorimetric method for the quantitation of the angiotensin II receptor antagonist telmisartan in urine.

    PubMed

    Torrealday, N; González, L; Alonso, R M; Jiménez, R M; Ortiz Lastra, E

    2003-08-01

    A high performance liquid chromatographic method with fluorimetric detection has been developed for the quantitation of the angiotensin II receptor antagonist (ARA II) 4-((2-n-propyl-4-methyl-6-(1-methylbenzimidazol-2-yl)-benzimidazol-1-yl)methyl)biphenyl-2-carboxylic acid (telmisartan) in urine, using a Novapak C18 column (3.9 x 150 mm, 4 microm). The mobile phase consisted of a mixture of acetonitrile and phosphate buffer (pH 6.0, 5 mM) (45:55, v/v) pumped at a flow rate of 0.5 ml min(-1). Effluent was monitored at excitation and emission wavelengths of 305 and 365 nm, respectively. Separation was carried out at room temperature. Chromatographic variables were optimised by means of experimental design. A clean-up step consisting of a solid-phase extraction procedure with C8 cartridges and methanol as eluent was used for urine samples. This method proved to be accurate (RE from -12 to 6%), precise (intra- and inter-day coefficients of variation (CV) were lower than 8%) and sensitive enough (limit of quantitation (LOQ), ca. 1 microg l(-1)) to be applied to the determination of the active drug in urine samples obtained from hypertensive patients. Concentration levels of telmisartan at different time intervals (from 0 up to 36 h after oral intake) were monitored. PMID:12899971
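
    The abstract notes that the chromatographic variables were optimised by means of experimental design. As a rough, hypothetical illustration of that idea (the factors, levels, and values below are assumed for the example and are not those of the published method), a two-level full-factorial screening matrix for a few chromatographic variables can be enumerated as follows:

    from itertools import product

    # Hypothetical two-level factors for a chromatographic screening design.
    factors = {
        "acetonitrile_fraction": (0.40, 0.50),   # v/v, assumed levels
        "buffer_pH": (5.5, 6.5),
        "flow_rate_ml_min": (0.4, 0.6),
    }

    # Full factorial: every combination of low/high levels (2**3 = 8 runs).
    design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
    for run, conditions in enumerate(design, start=1):
        print(run, conditions)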

  4. The design and characterization of a testing platform for quantitative evaluation of tread performance on multiple biological substrates.

    PubMed

    Sliker, Levin J; Rentschler, Mark E

    2012-09-01

    In this study, an experimental platform is developed to quantitatively measure the performance of robotic wheel treads in a dynamic environment. The platform imposes a dynamic driving condition for a single robot wheel, where the wheel is rotated on a translating substrate, thereby inducing slip. The normal force of the wheel can be adjusted mechanically, while the rotational velocity of the wheel and the translational velocity of the substrate can be controlled using an open-loop control system. Wheel slip and translational speed can be varied autonomously while wheel traction force is measured using a load cell. The testing platform is characterized by testing one micropatterned polydimethylsiloxane (PDMS) tread on three substrates (dry synthetic tissue, hydrated synthetic tissue, and excised porcine small bowel tissue), at three normal forces (0.10, 0.20, and 0.30 N), 13 slip ratios (-0.30 to 0.30 in increments of 0.05), and three translational speeds (2, 3, and 6 mm/s). Additionally, two wheels (micropatterned and smooth PDMS) are tested on beef liver at the same three normal forces and translational speeds for a tread comparison. An analysis of variance revealed that the platform can detect statistically significant differences between means when observing normal forces, translational speeds, slip ratios, treads, and substrates. The variance due to within (platform error, P = 1) and between trials (human error, P = 0.152) is minimal when compared to the normal force (P = 0.036), translational speed (P = 0.059), slip ratio (P = 0), tread (P = 0.004), and substrate variances (P = 0). In conclusion, this precision testing platform can be used to determine wheel tread performance differences on the three substrates and for each of the studied parameters. Future use of the platform could lead to an optimized micropattern-based mobility system, under given operating conditions, for implementation on a robotic capsule endoscope. PMID:22736689
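
    As a minimal sketch of the kind of analysis of variance reported above (the traction values below are fabricated purely for illustration and do not come from the study; only a one-way comparison across substrates is shown, not the full multi-factor analysis), such a test could be run as follows:

    from scipy import stats

    # Hypothetical traction forces (N) measured on each substrate.
    dry_synthetic = [0.21, 0.24, 0.22, 0.25]
    hydrated_synthetic = [0.15, 0.17, 0.14, 0.16]
    porcine_bowel = [0.11, 0.12, 0.10, 0.13]

    # One-way ANOVA: does mean traction force differ across substrates?
    F, p = stats.f_oneway(dry_synthetic, hydrated_synthetic, porcine_bowel)
    print(f"F = {F:.2f}, p = {p:.4f}")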

  5. Fitting statistical distributions to sea duck count data: implications for survey design and abundance estimation

    USGS Publications Warehouse

    Zipkin, Elise F.; Leirness, Jeffery B.; Kinlan, Brian P.; O'Connell, Allan F.; Silverman, Emily D.

    2014-01-01

    Determining appropriate statistical distributions for modeling animal count data is important for accurate estimation of abundance, distribution, and trends. In the case of sea ducks along the U.S. Atlantic coast, managers want to estimate local and regional abundance to detect and track population declines, to define areas of high and low use, and to predict the impact of future habitat change on populations. In this paper, we used a modified marked point process to model survey data that recorded flock sizes of Common eiders, Long-tailed ducks, and Black, Surf, and White-winged scoters. The data come from an experimental aerial survey, conducted by the United States Fish & Wildlife Service (USFWS) Division of Migratory Bird Management, during which east-west transects were flown along the Atlantic Coast from Maine to Florida during the winters of 2009–2011. To model the number of flocks per transect (the points), we compared the fit of four statistical distributions (zero-inflated Poisson, zero-inflated geometric, zero-inflated negative binomial and negative binomial) to data on the number of species-specific sea duck flocks that were recorded for each transect flown. To model the flock sizes (the marks), we compared the fit of flock size data for each species to seven statistical distributions: positive Poisson, positive negative binomial, positive geometric, logarithmic, discretized lognormal, zeta and Yule–Simon. Akaike’s Information Criterion and Vuong’s closeness tests indicated that the negative binomial and discretized lognormal were the best distributions for all species for the points and marks, respectively. These findings have important implications for estimating sea duck abundances as the discretized lognormal is a more skewed distribution than the Poisson and negative binomial, which are frequently used to model avian counts; the lognormal is also less heavy-tailed than the power law distributions (e.g., zeta and Yule–Simon), which are becoming increasingly popular for group size modeling. Choosing appropriate statistical distributions for modeling flock size data is fundamental to accurately estimating population summaries, determining required survey effort, and assessing and propagating uncertainty through decision-making processes.
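
    As a rough sketch of the model-comparison step described above (the counts below are hypothetical, and only the Poisson and negative binomial candidates are shown rather than the full set of distributions considered in the paper), maximum-likelihood fits to flock counts per transect can be compared by AIC as follows:

    import numpy as np
    from scipy import stats, optimize

    # Hypothetical flock counts per transect for one species.
    counts = np.array([0, 0, 3, 1, 0, 7, 2, 0, 0, 12, 4, 1])

    def nll_poisson(params, x):
        # Negative log-likelihood of a Poisson model.
        return -np.sum(stats.poisson.logpmf(x, params[0]))

    def nll_nbinom(params, x):
        # Negative log-likelihood of a negative binomial model (n, p parameterization).
        return -np.sum(stats.nbinom.logpmf(x, params[0], params[1]))

    fit_pois = optimize.minimize(nll_poisson, x0=[counts.mean()], args=(counts,),
                                 bounds=[(1e-6, None)])
    fit_nb = optimize.minimize(nll_nbinom, x0=[1.0, 0.5], args=(counts,),
                               bounds=[(1e-6, None), (1e-6, 1 - 1e-6)])

    aic = lambda nll, k: 2 * k + 2 * nll
    print("Poisson AIC:", round(aic(fit_pois.fun, 1), 1))
    print("Negative binomial AIC:", round(aic(fit_nb.fun, 2), 1))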

  6. Model driven transformation between design models to system test models using UML: a survey

    Microsoft Academic Search

    Sundus Bukhari; Tabinda Waheed

    2010-01-01

    Satisfying the needs of customers is of high priority in any software development life cycle (SDLC) process. Delivering value-added software products to customers depends on many factors, such as efficient requirements engineering and design development, but the success of any software product depends mainly on how efficiently software testing is carried out. It is the testing phase in

  7. SURVEY OF THE EXISTING APPROACHES TO ASSESS AND DESIGN NATURAL VENTILATION AND NEED FOR FURTHER DEVELOPMENTS

    E-print Network

    Boyer, Edmond

    presents a review of the existing approaches to predict natural ventilation performance, including simple empirical models, nodal models (mono-zone and multi-zones), zonal models and CFD models. For each approach are given. The aim of the review is to identify the main practical limits of existing programs in designing

  8. A survey of systems approaches to green design with illustrations from the computer industry

    Microsoft Academic Search

    Marion A. Hersh

    1998-01-01

    Increased recognition of the importance of sustainable development is posing new challenges for industry. A systems approach provides a framework for reconciling the needs of sustainable development with all of the other demands on industry. The computer industry has been chosen to illustrate the application of a systems approach to green design in industry on account of both its increasing

  9. Toward AUV survey design for optimal coverage and localization using the Cramer-Rao lower bound

    Microsoft Academic Search

    Ayoung Kim; Ryan M. Eustice

    2009-01-01

    This paper discusses an approach to using the Cramer-Rao Lower Bound (CRLB) as a trajectory design tool for autonomous underwater vehicle (AUV) visual navigation. We begin with a discussion of Fisher Information as a measure of the lower bound of uncertainty in a simultaneous localization and mapping (SLAM) pose-graph. Treating the AUV trajectory as a non-random parameter, the Fisher

  10. Survey of piloting factors in V/STOL aircraft with implications for flight control system design

    NASA Technical Reports Server (NTRS)

    Ringland, R. F.; Craig, S. J.

    1977-01-01

    Flight control system design factors involved for pilot workload relief are identified. Major contributors to pilot workload include configuration management and control and aircraft stability and response qualities. A digital fly by wire stability augmentation, configuration management, and configuration control system is suggested for reduction of pilot workload during takeoff, hovering, and approach.

  11. A survey of design techniques for system-level dynamic power management

    Microsoft Academic Search

    Luca Benini; Alessandro Bogliolo; Giovanni De Micheli

    2000-01-01

    Dynamic power management (DPM) is a design methodology for dynamically reconfiguring systems to provide the requested services and performance levels with a minimum number of active components or a minimum load on such components. DPM encompasses a set of techniques that achieves energy-efficient computation by selectively turning off (or reducing the performance of) system components when they are

  12. On Quantitizing

    PubMed Central

    Sandelowski, Margarete; Voils, Corrine I.; Knafl, George

    2009-01-01

    Quantitizing, commonly understood to refer to the numerical translation, transformation, or conversion of qualitative data, has become a staple of mixed methods research. Typically glossed are the foundational assumptions, judgments, and compromises involved in converting disparate data sets into each other and whether such conversions advance inquiry. Among these assumptions are that qualitative and quantitative data constitute two kinds of data, that quantitizing constitutes a unidirectional process essentially different from qualitizing, and that counting is an unambiguous process. Among the judgments are deciding what and how to count. Among the compromises are balancing numerical precision with narrative complexity. The standpoints of “conditional complementarity,” “critical remediation,” and “analytic alternation” clarify the added value of converting qualitative data into quantitative form. PMID:19865603

  13. The C-Band All-Sky Survey (C-BASS): design and implementation of the northern receiver

    NASA Astrophysics Data System (ADS)

    King, O. G.; Jones, Michael E.; Blackhurst, E. J.; Copley, C.; Davis, R. J.; Dickinson, C.; Holler, C. M.; Irfan, M. O.; John, J. J.; Leahy, J. P.; Leech, J.; Muchovej, S. J. C.; Pearson, T. J.; Stevenson, M. A.; Taylor, Angela C.

    2014-03-01

    The C-Band All-Sky Survey is a project to map the full sky in total intensity and linear polarization at 5 GHz. The northern component of the survey uses a broad-band single-frequency analogue receiver fitted to a 6.1-m telescope at the Owens Valley Radio Observatory in California, USA. The receiver architecture combines a continuous-comparison radiometer and a correlation polarimeter in a single receiver for stable simultaneous measurement of both total intensity and linear polarization, using custom-designed analogue receiver components. The continuous-comparison radiometer measures the temperature difference between the sky and temperature-stabilized cold electrical reference loads. A cryogenic front-end is used to minimize receiver noise, with a system temperature of approximately 30 K in both linear polarization and total intensity. Custom cryogenic notch filters are used to counteract man-made radio frequency interference. The radiometer 1/f noise is dominated by atmospheric fluctuations, while the polarimeter achieves a 1/f noise knee frequency of 10 mHz, similar to the telescope azimuthal scan frequency.

  14. The C-Band All-Sky Survey (C-BASS): Design and implementation of the northern receiver

    E-print Network

    King, O G; Blackhurst, E J; Copley, C; Davis, R J; Dickinson, C; Holler, C M; Irfan, M O; John, J J; Leahy, J P; Leech, J; Muchovej, S J C; Pearson, T J; Stevenson, M A; Taylor, Angela C

    2013-01-01

    The C-Band All-Sky Survey (C-BASS) is a project to map the full sky in total intensity and linear polarization at 5 GHz. The northern component of the survey uses a broadband single-frequency analogue receiver fitted to a 6.1-m telescope at the Owens Valley Radio Observatory in California, USA. The receiver architecture combines a continuous-comparison radiometer and a correlation polarimeter in a single receiver for stable simultaneous measurement of both total intensity and linear polarization, using custom-designed analogue receiver components. The continuous-comparison radiometer measures the temperature difference between the sky and temperature-stabilized cold electrical reference loads. A cryogenic front-end is used to minimize receiver noise, with a system temperature of $\approx 30$ K in both linear polarization and total intensity. Custom cryogenic notch filters are used to counteract man-made radio frequency interference. The radiometer $1/f$ noise is dominated by atmospheric fluctuations, while th...

  15. A Survey of the Role of Noncovalent Sulfur Interactions in Drug Design.

    PubMed

    Beno, Brett R; Yeung, Kap-Sun; Bartberger, Michael D; Pennington, Lewis D; Meanwell, Nicholas A

    2015-06-11

    Electron deficient, bivalent sulfur atoms have two areas of positive electrostatic potential, a consequence of the low-lying σ* orbitals of the C-S bond that are available for interaction with electron donors including oxygen and nitrogen atoms and, possibly, π-systems. Intramolecular interactions are by far the most common manifestation of this effect, which offers a means of modulating the conformational preferences of a molecule. Although a well-documented phenomenon, a priori applications in drug design are relatively sparse and this interaction, which is often isosteric with an intramolecular hydrogen-bonding interaction, appears to be underappreciated by the medicinal chemistry community. In this Perspective, we discuss the theoretical basis for sulfur σ* orbital interactions and illustrate their importance in the context of drug design and organic synthesis. The role of sulfur interactions in protein structure and function is discussed and although relatively rare, intermolecular interactions between ligand C-S σ* orbitals and proteins are illustrated. PMID:25734370

  16. A design of sample survey for estimation of cocoa production in Western Nigeria 

    E-print Network

    Salami, Shakiru Okunola

    1973-01-01

    difficulties and cost. These attempts involve agricultural regions as 'strata', villages as 'primary sampling units', and cocoa operators as 'secondary sampling units'. The design enunciated in this thesis takes the following factors into consideration: (i... those of peasant farmers. The sampling schemes in the area and list frames are identical. The schemes involve the administrative Divisions as primary strata, the villages as primary sampling units, and cocoa orchards as secondary sampling units...

  17. Injury survey of a non-traditional ‘soft-edged’ trampoline designed to lower equipment hazards

    Microsoft Academic Search

    David B. Eager; Carl Scarrott; Jim Nixon; Keith Alexander

    2012-01-01

    In Australia trampolines contribute one quarter of all childhood play equipment injuries. The objective of this study was to gather and evaluate injury data from a non-traditional, ‘soft-edged’, consumer trampoline, where the design aimed to minimise injuries from the equipment and from falling off. The manufacturer of the non-traditional trampoline provided the University of Technology Sydney with their Australian customer

  18. The design and construction of an infrared detector for use with a highway traffic survey system 

    E-print Network

    Mundkowsky, William Fredrick

    1961-01-01

    energy that the target radiates, seems to be the best approach for a highway traffic detector. This system should be an intermediate infrared type due to the wavelength band of interest. This determines the type of detector that may be employed... [Remaining text is table-of-contents residue: The Optics; Spectral Filtering; Spatial Filtering; Design Considerations; Test Procedures and Results; Conclusion; Appendix I, Basic Infrared Radiation Laws; Appendix II, Background Radiation; Appendix III, Atmospheric Transmission...]

  19. The Netherlands Mental Health Survey and Incidence Study (NEMESIS): objectives and design

    Microsoft Academic Search

    R. V. Bijl; G. van Zessen; A. Ravelli; C. de Rijk; Y. Langendoen

    1998-01-01

    The article describes the objectives and design of a prospective study of the prevalence, incidence and course of psychiatric disorders in a representative sample of non-institutionalized Dutch adults. A total of 7146 men and women aged 18–64, contacted through a multistage sample of municipalities and households, were interviewed at home in 1996. The primary diagnostic instrument was the CIDI, which

  20. Designing Future Dark Energy Space Missions: II. Photometric Redshift of Space Weak Lensing Optimized Survey

    E-print Network

    Jouvel, S; Bernstein, G; Ilbert, O; Jelinsky, P; Milliard, B; Ealet, A; Schimd, C; Dahlen, T; Arnouts, S

    2010-01-01

    Accurate weak-lensing analysis requires not only accurate measurement of galaxy shapes but also precise and unbiased measurement of galaxy redshifts. The photometric redshift technique appears as the only possibility to determine the redshift of the background galaxies used in the weak-lensing analysis. Using the photometric redshift quality, simple shape measurement requirements, and a proper sky model, we explore what could be an optimal weak-lensing dark energy mission based on FoM calculation. We found that photometric redshifts reach their best accuracy for the bulk of the faint galaxy population when filters have a resolution R~3.2. We show that an optimal mission would survey the sky through 8 filters using 2 cameras (visible and near infrared). Assuming a 5-year mission duration, a mirror size of 1.5 m, a 0.5 deg2 FOV with a visible pixel scale of 0.15", to maximize the Weak Lensing FoM, an optimal exposure time is found to be 4 x 200 s per filter (at the Galactic poles) thus covering ~11000 deg2 of the sk...

  1. The measurement of public opinion on abortion: the effects of survey design.

    PubMed

    Bumpass, L L

    1997-01-01

    A factorial experiment examined the effects of the wording and sequence of survey questions on the measurement of attitudes toward abortion. When a first-trimester pregnancy is specified, 55% of respondents agree that a woman should be able to obtain a legal abortion for any reason, compared with 44% when no pregnancy duration is stated. Specifying first-trimester pregnancies has little effect on the proportion of respondents who agree that abortion should be available for maternal health, fetal defects or rape, but it significantly increases the proportion who agree that a woman should be able to obtain an abortion if she is single, has financial constraints or wants no more children. When gestational lengths from one to six months are presented to respondents in ascending order, agreement that a woman should be able to obtain an abortion for any reason is lower for any given length of gestation than when pregnancy durations are presented in descending order. Forty-eight percent of respondents agree that abortion should be legal for any reason when that question is posed after a series of specific reasons; however, 60% do so when it is the first question in the sequence. The difference in agreement with abortion for any reason between Catholics and non-Baptist Protestants, and between Republicans and Democrats, is much smaller when the question is asked first than when it is presented last. PMID:9258650
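
    As a small, hypothetical illustration of how such wording effects can be tested (the counts below are invented for the example and are not the study's data), a two-sample proportion test comparing agreement under two question versions might look like this:

    import numpy as np
    from statsmodels.stats.proportion import proportions_ztest

    # Hypothetical numbers of respondents agreeing under each question wording.
    agree = np.array([550, 440])
    asked = np.array([1000, 1000])

    # Two-sample z-test for a difference in proportions (55% vs. 44% agreement).
    z, p = proportions_ztest(agree, asked)
    print(f"z = {z:.2f}, p = {p:.4f}")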

  2. Quantitative Graphics in Newspapers.

    ERIC Educational Resources Information Center

    Tankard, James W., Jr.

    The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…

  3. Design and Calibration of a Flowfield Survey Rake for Inlet Flight Research

    NASA Technical Reports Server (NTRS)

    Flynn, Darin C.; Ratnayake, Nalin A.; Frederick, Michael

    2009-01-01

    The Propulsion Flight Test Fixture at the NASA Dryden Flight Research Center is a unique test platform available for use on NASA's F-15B aircraft, tail number 836, as a modular host for a variety of aerodynamics and propulsion research. For future flight data from this platform to be valid, more information must be gathered concerning the quality of the airflow underneath the body of the F-15B at various flight conditions, especially supersonic conditions. The flow angularity and Mach number must be known at multiple locations on any test article interface plane for measurement data at these locations to be valid. To determine this prerequisite information, flight data will be gathered in the Rake Airflow Gauge Experiment using a custom-designed flowfield rake to probe the airflow underneath the F-15B at the desired flight conditions. This paper addresses the design considerations of the rake and probe assembly, including the loads and stress analysis using analytical methods, computational fluid dynamics, and finite element analysis. It also details the flow calibration procedure, including the completed wind-tunnel test and posttest data reduction, calibration verification, and preparation for flight-testing.

  4. Making Full Use of the Longitudinal Design of the Current Population Survey: Methods for Linking Records Across 16 Months *

    PubMed Central

    Drew, Julia A. Rivera; Flood, Sarah; Warren, John Robert

    2015-01-01

    Data from the Current Population Survey (CPS) are rarely analyzed in a way that takes advantage of the CPS’s longitudinal design. This is mainly because of the technical difficulties associated with linking CPS files across months. In this paper, we describe the method we are using to create unique identifiers for all CPS person and household records from 1989 onward. These identifiers—available along with CPS basic and supplemental data as part of the on-line Integrated Public Use Microdata Series (IPUMS)—make it dramatically easier to use CPS data for longitudinal research across any number of substantive domains. To facilitate the use of these new longitudinal IPUMS-CPS data, we also outline seven different ways that researchers may choose to link CPS person records across months, and we describe the sample sizes and sample retention rates associated with these seven designs. Finally, we discuss a number of unique methodological challenges that researchers will confront when analyzing data from linked CPS files.
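
    A minimal sketch of the linking idea described above (the identifiers and variables below are invented for illustration and are not the actual IPUMS-CPS fields), joining person records from two adjacent months on a constructed person key:

    import pandas as pd

    # Hypothetical person-level records for two adjacent CPS months.
    march = pd.DataFrame({"person_id": ["A1", "A2", "B1"], "employed": [1, 0, 1]})
    april = pd.DataFrame({"person_id": ["A1", "B1", "C1"], "employed": [1, 1, 0]})

    # An inner join on the person identifier links records present in both months.
    linked = march.merge(april, on="person_id", suffixes=("_mar", "_apr"), how="inner")
    retention = len(linked) / len(march)   # share of March records re-interviewed in April
    print(linked)
    print("retention rate:", retention)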

  5. Toward an Effective Design Process: Enhancing Building Performance through Better Integration of Facility Management Perspectives in the Design Process 

    E-print Network

    Kalantari Hematabadi, Seyed Saleh

    2014-08-20

    [Only table-of-contents fragments are available for this record: 3.4 Quantitative Method; 3.4.1 Developing the Survey Questionnaire; Figure 3, Design Development Spiral with Additional Respect to the Sustainability; Figure 4, The Map of Managing the Building Design Process and 14 Toyota Way Principles...]

  6. Design, methods and demographic findings of the DEMINVALL survey: a population-based study of Dementia in Valladolid, Northwestern Spain

    PubMed Central

    2012-01-01

    Background This article describes the rationale and design of a population-based survey of dementia in Valladolid (northwestern Spain). The main aim of the study was to assess the epidemiology of dementia and its subtypes. Prevalence of anosognosia in dementia patients, nutritional status, diet characteristics, and determinants of non-diagnosed dementia in the community were studied. The main sociodemographic, educational, and general health status characteristics of the study population are described. Methods Cross-over and cohort, population-based study. A two-phase door-to-door study was performed. Both urban and rural environments were included. In phase 1 (February 2009 – February 2010) 28 trained physicians examined a population of 2,989 subjects (age ≥ 65 years). The seven-minute screen neurocognitive battery was used. In phase 2 (May 2009 – May 2010) 4 neurologists, 1 geriatrician, and 3 neuropsychologists confirmed the diagnosis of dementia and subtype in patients screened positive by a structured neurological evaluation. Specific instruments to assess anosognosia, the nutritional status and diet characteristics were used. Of the initial sample, 2,170 subjects were evaluated (57% female, mean age 76.5 ± 7.8, 5.2% institutionalized), whose characteristics are described. 227 persons were excluded for various reasons. Among those eligible were 592 non-responders. The attrition bias of non-responders was lower in rural areas. 241 screened positive (11.1%). Discussion The survey will explore some clinical, social and health related life-style variables of dementia. The population size and the diversification of social and educational backgrounds will contribute to a better knowledge of dementia in our environment. PMID:22935626

  7. THE VIRUS-P EXPLORATION OF NEARBY GALAXIES (VENGA): SURVEY DESIGN, DATA PROCESSING, AND SPECTRAL ANALYSIS METHODS

    SciTech Connect

    Blanc, Guillermo A. [Observatories of the Carnegie Institution for Science, Pasadena, CA (United States); Weinzirl, Tim; Song, Mimi; Heiderman, Amanda; Gebhardt, Karl; Jogee, Shardha; Evans, Neal J. II; Kaplan, Kyle; Marinova, Irina; Vutisalchavakul, Nalin [Department of Astronomy, University of Texas at Austin, Austin, TX (United States); Van den Bosch, Remco C. E. [Max Planck Institute for Astronomy, Heidelberg (Germany); Luo Rongxin; Hao Lei [Shanghai Astronomical Observatory, Shanghai (China); Drory, Niv [Instituto de Astronomia, Universidad Nacional Autonoma de Mexico, Mexico, DF (Mexico); Fabricius, Maximilian [Max Planck Institute for Extraterrestrial Physics, Garching (Germany); Fisher, David [Department of Astronomy, University of Maryland, College Park, MD (United States); Yoachim, Peter [Astronomy Department, University of Washington, Seattle, WA (United States)

    2013-05-15

    We present the survey design, data reduction, and spectral fitting pipeline for the VIRUS-P Exploration of Nearby Galaxies (VENGA). VENGA is an integral field spectroscopic survey, which maps the disks of 30 nearby spiral galaxies. Targets span a wide range in Hubble type, star formation activity, morphology, and inclination. The VENGA data cubes have 5.6 arcsec FWHM spatial resolution, approximately 5 Å FWHM spectral resolution, sample the 3600-6800 Å range, and cover large areas typically sampling galaxies out to approximately 0.7 R25. These data cubes can be used to produce two-dimensional maps of the star formation rate, dust extinction, electron density, stellar population parameters, the kinematics and chemical abundances of both stars and ionized gas, and other physical quantities derived from the fitting of the stellar spectrum and the measurement of nebular emission lines. To exemplify our methods and the quality of the data, we present the VENGA data cube on the face-on Sc galaxy NGC 628 (a.k.a. M 74). The VENGA observations of NGC 628 are described, as well as the construction of the data cube, our spectral fitting method, and the fitting of the stellar and ionized gas velocity fields. We also propose a new method to measure the inclination of nearly face-on systems based on the matching of the stellar and gas rotation curves using asymmetric drift corrections. VENGA will measure relevant physical parameters across different environments within these galaxies, allowing a series of studies on star formation, structure assembly, stellar populations, chemical evolution, galactic feedback, nuclear activity, and the properties of the interstellar medium in massive disk galaxies.

  8. Survey design to maximize the volume of exploration of the InfiniTEM system when looking for discrete targets

    NASA Astrophysics Data System (ADS)

    Desmarais, Jacques K.; Smith, Richard S.

    2015-04-01

    A discrete conductor model was used to estimate the volume of influence of a dual transmitter loop ground time-domain electromagnetic system (the InfiniTEM system). A sphere model in locally uniform field was used to calculate the signal from a subsurface target where the currents are constrained to flow vertically. The noise was determined from two field surveys. The signal-to-noise ratio was determined at each subsurface target location and each receiver location. The sensitivity of the InfiniTEM system at each target location was defined as the maximum of the absolute value of the signal-to-noise ratio for the ensemble of receiver positions in the survey. The volume of influence is defined as the volume where all targets have a sensitivity greater than one. The manner in which volume of influence varies can be used to determine the optimal design parameters of an InfiniTEM survey. Our analysis reveals that the InfiniTEM system should be operated with a loop separation distance of 1.5 times the loop width (where width and separation are measured parallel to the traverse lines); and that there should be 4 traverse lines between the loops, corresponding to a traverse line spacing of 250 m for a loop width of 1000 m. For the purposes of delineating highly conductive targets, the optimal waveform parameters are a high duty cycle (in our case 0.75), a low base frequency (in our case 10 Hz), and measurements should be made in the B field domain. For the purposes of finding less conductive targets, the base frequency should be high (in our case 30 Hz), the duty cycle should be low (in our case 0.25), and measurements should be made in the ∂B/∂t domain. Our study confirms that the InfiniTEM system can detect a 100 m radius sphere at up to 925 m depth. We have determined that electromagnetic systems are most sensitive to bodies striking perpendicular to the traverse line. As well, we have confirmed that the InfiniTEM system is most effective at detecting vertical targets.
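
    A bare-bones sketch of the sensitivity definition used above (the signal model, noise level, and geometry below are placeholders invented for illustration, not the authors' sphere-in-uniform-field solution): for each candidate target location, take the maximum absolute signal-to-noise ratio over all receiver positions, and flag locations where it exceeds one as lying inside the volume of influence.

    import numpy as np

    noise = 0.05                                  # assumed noise floor, arbitrary units
    receivers = np.linspace(-500.0, 500.0, 41)    # receiver x-positions along a traverse (m)
    targets = [(-200.0, 150.0), (0.0, 400.0), (300.0, 900.0)]   # (x, depth) of test targets (m)

    def signal(target, rx):
        # Placeholder 1/r**3 falloff standing in for the sphere response model.
        x, depth = target
        r = np.hypot(rx - x, depth)
        return 1.0e6 / r ** 3

    for target in targets:
        sensitivity = max(abs(signal(target, rx)) / noise for rx in receivers)
        inside = sensitivity > 1.0
        print(target, round(sensitivity, 2), "inside volume of influence" if inside else "outside")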

  9. Do We Need to Go Cellular? Assessing Political Media Consumption Using a Single-Frame Landline/Cellular Survey Design

    Microsoft Academic Search

    Megan R. Hill; John M. Tchernev; R. Lance Holbert

    2012-01-01

    Much research has been published on cellular-phone-only households and the challenges posed by cellular phones to traditional survey methodologies that attempt to generate representative samples using only landline telecommunications. This study reports analyses comparing two separate survey strata (N_landline = 152, N_cellular = 153) collected simultaneously and nested within a single-frame survey of a state in the American Midwest for differences in

  10. Design, objectives, and lessons from a pilot 25 year follow up re-survey of survivors in the Whitehall study of London Civil Servants

    Microsoft Academic Search

    R. Clarke; E. Breeze; P. Sherliker; M. Shipley; L. Youngman; A. Fletcher; R. Fuhrer; D. Leon; S. Parish; R. Collins; M. Marmot

    1998-01-01

    DESIGN: To assess the feasibility of conducting a re-survey of men who are resident in the United Kingdom 25 years after enrollment in the Whitehall study of London Civil Servants. METHODS: A random sample of 401 study survivors resident in three health authority areas was selected for this pilot study. They were mailed a request to complete a self administered

  11. Tailings Pond Characterization And Designing Through Geophysical Surveys In Dipping Sedimentary Formations

    NASA Astrophysics Data System (ADS)

    Muralidharan, D.; Andrade, R.; Anand, K.; Sathish, R.; Goud, K.

    2009-12-01

    Mining activities generate disintegrated waste materials with increased mobility, which require a safe disposal mechanism, either by backfilling or by secluded surface storage that prevents interaction with the environmental cycle. Surface disposal of waste materials becomes more critical when the mined minerals contain toxic or radioactive elements. In such cases, the subsurface nature of the disposal site must be characterized to understand its role in the environmental impact of loading the waste materials. Near-surface geophysics plays a major role in mapping the geophysical character of the subsurface formations in and around the disposal site and, to a certain extent, helps in designing the storage structure. Integrated geophysical methods involving resistivity tomography, ground magnetic and shallow seismic studies were carried out over a proposed tailings pond area of 0.3 sq. km underlain by dipping sedimentary rocks consisting of ferruginous shales and dolomitic to siliceous limestone of varying thicknesses. Because the investigated site lies in a tectonically disturbed area, geophysical investigations were carried out along a number of profiles to visualize the subsurface clearly. Integrating the results of twenty resistivity tomography profiles with 2 m (shallow) and 10 m (moderate depth) electrode spacings enabled the preparation of a probable subsurface geological section along the strike direction of the formation under the tailings pond, with a geo-tectonic structure inferred to be a fault. Similarly, two resistivity tomography profiles perpendicular to the strike direction of the formations brought out the existence of a buried basic intrusive body on the northern boundary of the proposed tailings pond. Two resistivity tomography profiles laid criss-cross over the suspected fault zone confirmed the existence of a fault on the north-eastern part of the tailings pond. Thirty-two magnetic profiles inside the tailings pond and in the surrounding areas on its southern part identified two parallel east-west intrusive bodies forming an impermeable boundary for the tailings pond. The shallow seismic refraction and other geophysical studies in and around the proposed tailings pond brought out the suitability of the site: even if toxic elements percolate through the subsurface formations into the groundwater system, the dykes on either side of the proposed ponding area will not allow water to move across them, thereby restricting contamination to within the tailings pond area. Similarly, the delineation of a fault zone within the tailings pond area helped in shifting the proposed dam axis of the pond to avoid leakage through the fault zone and the resulting environmental pollution.

  12. ESO imaging survey: optical deep public survey

    NASA Astrophysics Data System (ADS)

    Mignano, A.; Miralles, J.-M.; da Costa, L.; Olsen, L. F.; Prandoni, I.; Arnouts, S.; Benoist, C.; Madejsky, R.; Slijkhuis, R.; Zaggia, S.

    2007-02-01

    This paper presents new five-passband (UBVRI) optical wide-field imaging data accumulated as part of the DEEP Public Survey (DPS) carried out as a public survey by the ESO Imaging Survey (EIS) project. Out of the 3 square degrees originally proposed, the survey covers 2.75 square degrees, in at least one band (normally R), and 1.00 square degrees in five passbands. The median seeing, as measured in the final stacked images, is 0.97 arcsec, ranging from 0.75 arcsec to 2.0 arcsec. The median limiting magnitudes (AB system, 2 arcsec aperture, 5σ detection limit) are U_AB = 25.65, B_AB = 25.54, V_AB = 25.18, R_AB = 24.8 and I_AB = 24.12 mag, consistent with those proposed in the original survey design. The paper describes the observations and data reduction using the EIS Data Reduction System and its associated EIS/MVM library. The quality of the individual images was inspected, bad images were discarded and the remaining ones used to produce final image stacks in each passband, from which sources have been extracted. Finally, the scientific quality of these final images and associated catalogs was assessed qualitatively by visual inspection and quantitatively by comparison of statistical measures derived from these data with those of other authors as well as model predictions, and from direct comparison with the results obtained from the reduction of the same dataset using an independent (hands-on) software system. Finally, to illustrate one application of this survey, the results of a preliminary effort to identify sub-mJy radio sources are reported. To the limiting magnitude reached in the R and I passbands the success rate ranges from 66 to 81% (depending on the fields). These data are publicly available at CDS. Based on observations carried out at the European Southern Observatory, La Silla, Chile under program Nos. 164.O-0561, 169.A-0725, and 267.A-5729. Appendices A, B and C are only available in electronic form at http://www.aanda.org

  13. Application of Screening Experimental Designs to Assess Chromatographic Isotope Effect upon Isotope-Coded Derivatization for Quantitative Liquid Chromatography–Mass Spectrometry

    PubMed Central

    2015-01-01

    Isotope effect may cause partial chromatographic separation of labeled (heavy) and unlabeled (light) isotopologue pairs. Together with a simultaneous matrix effect, this could lead to unacceptable accuracy in quantitative liquid chromatography–mass spectrometry assays, especially when electrospray ionization is used. Four biologically relevant reactive aldehydes (acrolein, malondialdehyde, 4-hydroxy-2-nonenal, and 4-oxo-2-nonenal) were derivatized with light or heavy (d3-, 13C6-, 15N2-, or 15N4-labeled) 2,4-dinitrophenylhydrazine and used as model compounds to evaluate chromatographic isotope effects. For comprehensive assessment of retention time differences between light/heavy pairs under various gradient reversed-phase liquid chromatography conditions, major chromatographic parameters (stationary phase, mobile phase pH, temperature, organic solvent, and gradient slope) and different isotope labelings were addressed by multiple-factor screening using experimental designs that included both asymmetrical (Addelman) and Plackett–Burman schemes followed by statistical evaluations. Results confirmed that the most effective approach to avoid chromatographic isotope effect is the use of 15N or 13C labeling instead of deuterium labeling, while chromatographic parameters had no general influence. Comparison of the alternate isotope-coded derivatization assay (AIDA) using deuterium versus 15N labeling gave unacceptable differences (>15%) upon quantifying some of the model aldehydes from biological matrixes. On the basis of our results, we recommend the modification of the AIDA protocol by replacing d3-2,4-dinitrophenylhydrazine with 15N- or 13C-labeled derivatizing reagent to avoid possible unfavorable consequences of chromatographic isotope effects. PMID:24922593
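
    As a rough illustration of the screening-design machinery referred to above (this is the standard textbook cyclic construction of a 12-run Plackett-Burman matrix, not the authors' specific design, and the assignment of chromatographic factors to columns is left unspecified), such a two-level screening design can be generated as follows:

    import numpy as np

    # Standard generating row for the 12-run Plackett-Burman design (11 two-level factors).
    generator = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])

    # Cyclically shift the generator to build 11 runs, then append the all-low run.
    rows = [np.roll(generator, shift) for shift in range(11)]
    rows.append(-np.ones(11, dtype=int))
    design = np.vstack(rows)            # shape (12, 11): 12 runs x 11 factors
    print(design)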

  14. Geothermal energy as a source of electricity. A worldwide survey of the design and operation of geothermal power plants

    SciTech Connect

    DiPippo, R.

    1980-01-01

    An overview of geothermal power generation is presented. A survey of geothermal power plants is given for the following countries: China, El Salvador, Iceland, Italy, Japan, Mexico, New Zealand, Philippines, Turkey, USSR, and USA. A survey of countries planning geothermal power plants is included. (MHR)

  15. Stream Surveying

    NSDL National Science Digital Library

    Karen Williams

    The class breaks into groups of 3-4. Each week we go out to the same stream channel, and perform the stream survey activities in the USFS Harrelson document, one chapter per week. After the snow flies, collected data are used to calculate channel hydrologic and hydraulic data such as bankfull flow and critical shear stress. Designed for a geomorphology course

  16. Carleton Quantitative Inquiry, Reasoning, and Knowledge Initiative

    NSDL National Science Digital Library

    In an era increasingly awash with numbers, how can one parse it all out? How is it possible to separate the proverbial quantitative chaff from the valuable wheat? Carleton College has taken on this weighty matter with their Quantitative Inquiry, Reasoning, and Knowledge Initiative (QuIRK). On the site, visitors will find sections that include Curricular Materials, Quantitative Reasoning Assessment, and Program Design. Some key resources for educators include "10 Foundational Quantitative Reasoning Questions" and examples of assignments and courses designed to teach quantitative reasoning.

  17. Yeasts in floral nectar: a quantitative survey

    PubMed Central

    Herrera, Carlos M.; de Vega, Clara; Canto, Azucena; Pozo, María I.

    2009-01-01

    Background and Aims One peculiarity of floral nectar that remains relatively unexplored from an ecological perspective is its role as a natural habitat for micro-organisms. This study assesses the frequency of occurrence and abundance of yeast cells in floral nectar of insect-pollinated plants from three contrasting plant communities on two continents. Possible correlations between interspecific differences in yeast incidence and pollinator composition are also explored. Methods The study was conducted at three widely separated areas, two in the Iberian Peninsula (Spain) and one in the Yucatán Peninsula (Mexico). Floral nectar samples from 130 species (37–63 species per region) in 44 families were examined microscopically for the presence of yeast cells. For one of the Spanish sites, the relationship across species between incidence of yeasts in nectar and the proportion of flowers visited by each of five major pollinator categories was also investigated. Key Results Yeasts occurred regularly in the floral nectar of many species, where they sometimes reached extraordinary densities (up to 4 × 10^5 cells mm^-3). Depending on the region, between 32 and 44 % of all nectar samples contained yeasts. Yeast cell densities in the order of 10^4 cells mm^-3 were commonplace, and densities >10^5 cells mm^-3 were not rare. About one-fifth of species at each site had mean yeast cell densities >10^4 cells mm^-3. Across species, yeast frequency and abundance were directly correlated with the proportion of floral visits by bumble-bees, and inversely with the proportion of visits by solitary bees. Conclusions Incorporating nectar yeasts into the scenario of plant–pollinator interactions opens up a number of intriguing avenues for research. In addition, with yeasts being as ubiquitous and abundant in floral nectars as revealed by this study, and given their astounding metabolic versatility, studies focusing on nectar chemical features should carefully control for the presence of yeasts in nectar samples. PMID:19208669

  18. A novel approach to evaluate the extent and the effect of cross-contribution to the intensity of ions designating the analyte and the internal standard in quantitative GC-MS analysis

    Microsoft Academic Search

    Bud-Gen Chen; Chiung Dan Chang; Chia-Ting Wang; Yi-Jun Chen; Wei-Tun Chang; Sheng-Meng Wang; Ray H. Liu

    2008-01-01

    In gas chromatography-mass spectrometry methods of analysis adopting the analyte’s isotopic analog as the internal standard (IS), the cross-contribution (CC) phenomenon—contribution of IS to the intensities of the ions designating the analyte, and vice versa—has been demonstrated to affect the quantitation data. A novel approach based on the deviations of the empirically observed concentrations of a set of standards was
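
    A toy sketch of the cross-contribution idea described above (the CC fractions and intensities below are invented for illustration, and the simple two-by-two mixing correction shown here is a generic formulation, not the specific approach proposed in the paper): if a known fraction of the internal standard's signal appears in the analyte's ion and vice versa, the underlying contributions can be recovered by solving a small linear system.

    import numpy as np

    # Assumed cross-contribution fractions (signal leaking into the other channel).
    cc_is_to_analyte = 0.03    # fraction of IS signal appearing in the analyte ion
    cc_analyte_to_is = 0.05    # fraction of analyte signal appearing in the IS ion

    observed = np.array([1.00e5, 4.10e5])    # measured [analyte ion, IS ion] intensities

    # observed = mixing @ [true_analyte, true_is]
    mixing = np.array([[1.0, cc_is_to_analyte],
                       [cc_analyte_to_is, 1.0]])
    true_analyte, true_is = np.linalg.solve(mixing, observed)
    print(true_analyte, true_is, true_analyte / true_is)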

  19. Quantitative Analysen

    NASA Astrophysics Data System (ADS)

    Hübner, Philipp

    The holy grail of all analytical work is to be able to determine the true value. This requires quantitative measurement methods, which have now been available in molecular analysis for some time. The general problem with quantification is that we usually can neither know nor determine the true value. For this reason, we make do with approximations to the true value, either by calculating the median or the (robust) mean from interlaboratory comparison studies, or by calculating an expected value based on how the sample material was prepared. In these attempts to approximate the true value, the analysis is deliberately standardized, either according to the democratic principle that the majority decides, or by providing suitable certified reference material. We must therefore be aware that, although this procedure guarantees that the majority of analytical laboratories measure alike, we do not know whether they all measure equally well or, for that matter, equally poorly.

  20. Comparing Model-based and Design-based Structural Equation Modeling Approaches in Analyzing Complex Survey Data 

    E-print Network

    Wu, Jiun-Yu

    2011-10-21

    Conventional statistical methods assuming data sampled under simple random sampling are inadequate for use on complex survey data with a multilevel structure and non-independent observations. In structural equation modeling ...
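
    As a small, hypothetical illustration of why clustering matters for the variance estimates mentioned above (the data are simulated, and this shows only the design-effect idea for a simple mean, not the structural equation models compared in the thesis):

    import numpy as np

    rng = np.random.default_rng(0)
    n_clusters, per_cluster = 30, 10                 # e.g. 30 schools, 10 students each

    # Simulate positively correlated observations within clusters.
    cluster_effect = rng.normal(0.0, 1.0, size=n_clusters)
    y = cluster_effect[:, None] + rng.normal(0.0, 1.0, size=(n_clusters, per_cluster))

    # Naive SRS variance of the mean ignores the within-cluster correlation.
    naive_var = y.var(ddof=1) / y.size
    # Treating cluster means as the sampling units respects the design.
    cluster_means = y.mean(axis=1)
    design_var = cluster_means.var(ddof=1) / n_clusters

    print("design effect (design-based / naive variance):", round(design_var / naive_var, 2))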

  1. Rules for the preparation of manuscript and illustrations designed for publication by the United States Geological Survey

    USGS Publications Warehouse

    Hampson, Thomas

    1888-01-01

    In the annual report of the Director of the U. S. Geological Survey for 1885-'86, pages 40 and 41, you set forth the functions of the chief of the editorial division as follows: "To secure clear and accurate statement in the material sent to press, careful proof-reading, and uniformity in the details of book-making, as well as to assist the Director in exercising a general supervision over the publications of the Survey."

  2. EuropeaN Energy balance Research to prevent excessive weight Gain among Youth (ENERGY) project: Design and methodology of the ENERGY cross-sectional survey

    PubMed Central

    2011-01-01

    Background Obesity treatment is by and large ineffective in the long term, and more emphasis on the prevention of excessive weight gain in childhood and adolescence is warranted. To inform energy balance related behaviour (EBRB) change interventions, insight into the potential personal, family and school environmental correlates of these behaviours is needed. Studies on such multilevel correlates of EBRB among schoolchildren in Europe are lacking. The ENERGY survey aims to (1) provide up-to-date prevalence rates of measured overweight, obesity, self-reported engagement in EBRBs, and objective accelerometer-based assessment of physical activity and sedentary behaviour and blood-sample biomarkers of metabolic function in countries in different regions of Europe, and (2) identify personal, family and school environmental correlates of these EBRBs. This paper describes the design, methodology and protocol of the survey. Method/Design A school-based cross-sectional survey was carried out in 2010 in seven different European countries: Belgium, Greece, Hungary, the Netherlands, Norway, Slovenia, and Spain. The survey included measurements of anthropometrics, child, parent and school-staff questionnaires, and school observations to measure and assess outcomes (i.e. height, weight, and waist circumference), EBRBs and potential personal, family and school environmental correlates of these behaviours including the social-cultural, physical, political, and economic environmental factors. In addition, a selection of countries conducted accelerometer measurements to objectively assess physical activity and sedentary behaviour, and collected blood samples to assess several biomarkers of metabolic function. Discussion The ENERGY survey is a comprehensive cross-sectional study measuring anthropometrics and biomarkers as well as assessing a range of EBRBs and their potential correlates at the personal, family and school level, among 10-12 year old children in seven European countries. This study will result in a unique dataset, enabling cross-country comparisons in overweight, obesity, risk behaviours for these conditions as well as the correlates of engagement in these risk behaviours. PMID:21281466

  3. City Governments and Aging in Place: Community Design, Transportation and Housing Innovation Adoption

    ERIC Educational Resources Information Center

    Lehning, Amanda J.

    2012-01-01

    Purpose of the study: To examine the characteristics associated with city government adoption of community design, housing, and transportation innovations that could benefit older adults. Design and methods: A mixed-methods study with quantitative data collected via online surveys from 62 city planners combined with qualitative data collected via…

  4. MALAYSIAN FAMILY LIFE SURVEY

    EPA Science Inventory

    The Malaysian Family Life Surveys (MFLS) comprise a pair of surveys with partially overlapping samples, designed by RAND and administered in Peninsular Malaysia in 1976-77 (MFLS-1) and 1988-89 (MFLS-2). Each survey collected detailed current and retrospective information on famil...

  5. Computational vaccinology: quantitative approaches.

    PubMed

    Flower, Darren R; McSparron, Helen; Blythe, Martin J; Zygouri, Christianna; Taylor, Debra; Guan, Pingping; Wan, Shouzhan; Coveney, Peter V; Walshe, Valerie; Borrow, Persephone; Doytchinova, Irini A

    2003-01-01

    The immune system is hierarchical and has many levels, exhibiting much emergent behaviour. However, at its heart are molecular recognition events that are indistinguishable from other types of biomacromolecular interaction. These can be addressed well by quantitative experimental and theoretical biophysical techniques, and particularly by methods from drug design. We review here our approach to computational immunovaccinology. In particular, we describe the JenPep database and two new techniques for T cell epitope prediction. One is based on quantitative structure-activity relationships (a 3D-QSAR method based on CoMSIA and another 2D method based on the Free-Wilson approach) and the other on atomistic molecular dynamic simulations using high performance computing. JenPep (http://www.jenner.ar.uk/ JenPep) is a relational database system supporting quantitative data on peptide binding to major histocompatibility complexes, TAP transporters, TCR-pMHC complexes, and an annotated list of B cell and T cell epitopes. Our 2D-QSAR method factors the contribution to peptide binding from individual amino acids as well as 1-2 and 1-3 residue interactions. In the 3D-QSAR approach, the influence of five physicochemical properties (volume, electrostatic potential, hydrophobicity, hydrogen-bond donor and acceptor abilities) on peptide affinity were considered. Both methods are exemplified through their application to the well-studied problem of peptide binding to the human class I MHC molecule HLA-A*0201. PMID:14712934
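
    As a toy illustration of the Free-Wilson-style factoring mentioned above (the peptides, positions, and affinity values below are invented for the example, and only main-effect indicator terms are shown, without the 1-2 and 1-3 residue interaction terms described in the abstract):

    import numpy as np
    import pandas as pd

    # Hypothetical peptide residues at two positions and their binding affinities.
    data = pd.DataFrame({
        "p1": ["L", "I", "L", "V"],
        "p2": ["M", "M", "V", "V"],
        "affinity": [7.2, 6.8, 6.1, 5.5],     # e.g. -log(IC50), made up
    })

    # Indicator (dummy) variables for the residue occupying each position.
    X = pd.get_dummies(data[["p1", "p2"]], drop_first=True).astype(float)
    X.insert(0, "intercept", 1.0)

    # Ordinary least squares for the per-residue contributions to binding.
    coef, *_ = np.linalg.lstsq(X.values, data["affinity"].values, rcond=None)
    print(dict(zip(X.columns, np.round(coef, 3))))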

  6. A WHOLE-LAKE WATER QUALITY SURVEY OF LAKE OAHE BASED ON A SPATIALLY-BALANCED PROBABILISTIC DESIGN

    EPA Science Inventory

    Assessing conditions on large bodies of water presents multiple statistical and logistical challenges. As part of the Upper Missouri River Program of the Environmental Monitoring and Assessment Project (EMAP) we surveyed water quality of Lake Oahe in July-August, 2002 using a spat...

  7. State-of-the-art survey of existing knowledge for the design of ground-source heat pumps

    NASA Astrophysics Data System (ADS)

    Ball, D. A.; Fischer, R. D.; Talbert, S. G.; Hodgett, D.; Auer, F.

    1983-11-01

    This report describes the gathering of design and performance information on historical and current ground-coil heat pump systems, an assessment of the adequacy of available design methods, and the definition of near-term R&D needs to promote the use of this technology. The project was separated into two parts: (1) a review of North American technology conducted by Battelle-Columbus; and (2) a review of European technology conducted by Battelle-Frankfurt. Descriptions of basic ground-coil design configurations, operating experience, design methodologies, and reviews of costs of existing installations are included. It is found that further design method development efforts are necessary to provide installers and manufacturers with pertinent design information in order to stimulate further implementation of ground-coupled heat pumps in the United States. A research effort is needed to develop parametric data on the design and performance of a ground coil during cooling, using heat and moisture models.

  8. Electronic Survey Methodology

    E-print Network

    Nonnecke, Blair

    Electronic Survey Methodology: A Case Study in Reaching Hard… (Andrews, Nonnecke and Preece, 2002). Conducting Research on the Internet: Electronic Survey Design, Development and Implementation Guidelines.

  9. SURVEY LEADERSHIP: The Manager's Guide to Survey Feedback & Action Planning

    E-print Network

    Squire, Larry R.

    SURVEY LEADERSHIP: The Manager's Guide to Survey Feedback & Action Planning. A guide designed to help you understand and prepare to lead the survey feedback and action planning process (Morehead Associates).

  10. Effectiveness of Facebook Based Learning to Enhance Creativity among Islamic Studies Students by Employing Isman Instructional Design Model

    ERIC Educational Resources Information Center

    Alias, Norlidah; Siraj, Saedah; Daud, Mohd Khairul Azman Md; Hussin, Zaharah

    2013-01-01

    The study examines the effectiveness of Facebook based learning to enhance creativity among Islamic Studies students in the secondary educational setting in Malaysia. It describes the design process by employing the Isman Instructional Design Model. A quantitative study was carried out using experimental method and background survey. The…

  11. Design and Implementation of a Comprehensive Web-based Survey for Ovarian Cancer Survivorship with an Analysis of Prediagnosis Symptoms via Text Mining

    PubMed Central

    Sun, Jiayang; Bogie, Kath M; Teagno, Joe; Sun, Yu-Hsiang (Sam); Carter, Rebecca R; Cui, Licong; Zhang, Guo-Qiang

    2014-01-01

    Ovarian cancer (OvCa) is the most lethal gynecologic disease in the United States, with an overall 5-year survival rate of 44.5%, about half of the 89.2% for all breast cancer patients. To identify factors that possibly contribute to the long-term survivorship of women with OvCa, we conducted a comprehensive online Ovarian Cancer Survivorship Survey from 2009 to 2013. This paper presents the design and implementation of our survey, introduces its resulting data source, the OVA-CRADLE™ (Clinical Research Analytics and Data Lifecycle Environment), and illustrates a sample application of the survey and data by an analysis of prediagnosis symptoms, using text mining and statistics. The OVA-CRADLE™ is an application of our patented Physio-MIMI technology, facilitating Web-based access, online query and exploration of data. The prediagnostic symptoms and association of early-stage OvCa diagnosis with endometriosis provide potentially important indicators for future studies in this field. PMID:25861211

  12. Teaching quantitative biology: goals, assessments, and resources

    PubMed Central

    Aikens, Melissa L.; Dolan, Erin L.

    2014-01-01

    More than a decade has passed since the publication of BIO2010, calling for an increased emphasis on quantitative skills in the undergraduate biology curriculum. In that time, relatively few papers have been published that describe educational innovations in quantitative biology or provide evidence of their effects on students. Using a “backward design” framework, we lay out quantitative skill and attitude goals, assessment strategies, and teaching resources to help biologists teach more quantitatively. Collaborations between quantitative biologists and education researchers are necessary to develop a broader and more appropriate suite of assessment tools, and to provide much-needed evidence on how particular teaching strategies affect biology students' quantitative skill development and attitudes toward quantitative work. PMID:25368425

  13. Quantitative Activities for Introductory Astronomy

    Microsoft Academic Search

    Jonathan W. Keohane; J. L. Bartlett; J. P. Foy

    2010-01-01

    We present a collection of short lecture-tutorial (or homework) activities, designed to be both quantitative and accessible to the introductory astronomy student. Each of these involves interpreting some real data, solving a problem using ratios and proportionalities, and making a conclusion based on the calculation. Selected titles include: …

  14. A survey of attitudes and factors associated with successful cardiopulmonary resuscitation (CPR) knowledge transfer in an older population most likely to witness cardiac arrest: design and methodology

    PubMed Central

    Vaillancourt, Christian; Grimshaw, Jeremy; Brehaut, Jamie C; Osmond, Martin; Charette, Manya L; Wells, George A; Stiell, Ian G

    2008-01-01

    Background Overall survival rates for out-of-hospital cardiac arrest rarely exceed 5%. While bystander cardiopulmonary resuscitation (CPR) can increase survival for cardiac arrest victims by up to four times, bystander CPR rates remain low in Canada (15%). Most cardiac arrest victims are men in their sixties, they usually collapse in their own home (85%) and the event is witnessed 50% of the time. These statistics would appear to support a strategy of targeted CPR training for an older population that is most likely to witness a cardiac arrest event. However, interest in CPR training appears to decrease with advancing age. Behaviour surrounding CPR training and performance has never been studied using well validated behavioural theories. Methods/Design The overall goal of this study is to conduct a survey to better understand the behavioural factors influencing CPR training and performance in men and women 55 years of age and older. The study will proceed in three phases. In phase one, semi-structured qualitative interviews will be conducted and recorded to identify common categories and themes regarding seeking CPR training and providing CPR to a cardiac arrest victim. The themes identified in the first phase will be used in phase two to develop, pilot-test, and refine a survey instrument based upon the Theory of Planned Behaviour. In the third phase of the project, the final survey will be administered to a sample of the study population over the telephone. Analyses will include measures of sampling bias, reliability of the measures, construct validity, as well as multiple regression analyses to identify constructs and beliefs most salient to seniors' decisions about whether to attend CPR classes or perform CPR on a cardiac arrest victim. Discussion The results of this survey will provide valuable insight into factors influencing the interest in CPR training and performance among a targeted group of individuals most susceptible to witnessing a victim in cardiac arrest. The findings can then be applied to the design of trials of various interventions designed to promote attendance at CPR classes and improve CPR performance. Trial registration ClinicalTrials.gov NCT00665288 PMID:18986547

  15. A screw theory basis for quantitative and graphical design tools that define layout of actuators to minimize parasitic errors in parallel flexure systems

    Microsoft Academic Search

    Jonathan B. Hopkins; Martin L. Culpepper

    2010-01-01

    In this paper we introduce a visual approach for placing actuators within multi-axis parallel flexure systems such that position and orientation errors are minimized. A stiffness matrix, which links twists and wrenches, is used to generate geometric shapes that guide designers in selecting optimal actuator locations and orientations. The geometric shapes, called actuation spaces, enable designers to (i) visualize the

  16. A Meta-Analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    ERIC Educational Resources Information Center

    Zhang, Lin

    2014-01-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tool. This paper discusses the emerging issues, such as how can learning effectiveness be understood in relation to…

  17. Digital PCR and Quantitation

    E-print Network

    Perkins, Richard A.

    Digital PCR and Quantitation. Ross Haynes, Research Biologist, Applied Genetics Group. Forensics@NIST 2012 Meeting, Gaithersburg, MD, November 28, 2012. Agenda: Why quantitate with qPCR? How digital PCR will help quantitation. Quantitative PCR versus digital PCR.

  18. Conduct a state-of-the-art survey of existing knowledge for the design of ground-source heat pumps

    NASA Astrophysics Data System (ADS)

    Ball, D.

    Historic and current methods for designing ground-coil heat pumps, with emphasis on European and North American experiences, are discussed. Approximately 27 individual design and performance evaluation methods were studied, with most of them employing computer techniques. Modeling categories include steady-state analytical and transient analytical, lumped-parameter, finite-difference, and finite-element models in one, two, and three dimensions. A discussion of each is presented.

  19. The Math You Need, When You Need It: Student-Centered Web Resources Designed to Decrease Math Review and Increase Quantitative Geology in the Classroom

    NASA Astrophysics Data System (ADS)

    Wenner, J. M.; Baer, E. M.

    2007-12-01

    Introductory geoscience courses are rife with quantitative concepts from graphing to rates to unit conversions. Recent research suggests that supplementary mathematical instruction increases post-secondary students' retention and performance in science courses. Nonetheless, many geoscience faculty feel that they do not have enough time to cover all the geoscience content, let alone covering the math they often feel students should have learned before reaching their classes. We present our NSF-funded effort to create web modules for students that address these concerns. Our web resources focus on both student performance and faculty time issues by building students' quantitative skills through web-based, self-paced modular tutorials. Each module can be assigned to individual students who have demonstrated on a pre-test that they are in need of supplemental instruction. The pre-test involves problems that place mathematical concepts in a geoscience context and determines the students who need the most support with these skills. Students needing support are asked to complete a three-pronged web-based module just before the concept is needed in class. The three parts of each tutorial include: an explanation of the mathematics, a page of practice problems and an on-line quiz that is graded and sent to the instructor. Each of the modules is steeped in best practices in mathematics and geoscience education, drawing on multiple contexts and utilizing technology. The tutorials also provide students with further resources so that they can explore the mathematics in more depth. To assess the rigor of this program, students are given the pre-test again at the end of the course. The uniqueness of this program lies in a rich combination of mathematical concepts placed in multiple geoscience contexts, giving students the opportunity to explore the way that math relates to the physical world. We present several preliminary modules dealing with topics common in introductory geoscience courses. We seek feedback from faculty teaching all levels of geoscience addressing several questions: In what math/geoscience topics do you feel students need supplemental instruction? Where do students come up against quantitative topics that make them drop the class or perform poorly? Would you be willing to review or help us to test these modules in your class?

  20. Sport Management Survey. Employment Perspectives.

    ERIC Educational Resources Information Center

    Quain, Richard J.; Parks, Janet B.

    1986-01-01

    A survey of sport management positions was designed to determine projected vacancy rates in six sport management career areas. Respondents to the survey were also questioned regarding their awareness of college professional preparation programs. Results are presented. (MT)

  1. Detecting population declines over large areas with presence-absence, time-to-encounter, and count survey methods.

    PubMed

    Pollock, Jacob E

    2006-06-01

    Ecologists often discount presence-absence surveys as a poor way to gain insight into population dynamics, in part because these surveys are not amenable to many standard statistical tests. Still, presence-absence surveys are sometimes the only feasible alternative for monitoring large areas when funds are limited, especially for sparse or difficult-to-detect species. I undertook a detailed simulation study to compare the power of presence-absence, count, and time-to-encounter surveys to detect regional declines in a population. I used a modeling approach that simulates both population numbers and the monitoring process, accounting for observation and other measurement errors. In gauging the efficacy of presence-absence surveys versus other approaches, I varied the number of survey sites, the spatial variation in encounter rate, the mean encounter rate, and the type of population loss. My results showed that presence-absence data can be as or more powerful than count data in many cases. Quantitative guidelines for choosing between presence-absence surveys and count surveys depend on the biological and logistical constraints governing a conservation monitoring situation. Generally, presence-absence surveys work best when there is little variability in abundance among the survey sites, the organism is rare, and the species is difficult to detect so that the time spent getting to each survey site is less than or equal to the time spent surveying each site. Count surveys work best otherwise. I present a case study with count data on the Northern Flicker (Colaptes auratus) from the North American Breeding Bird Survey to illustrate how the method might be used with field-survey data. The case study demonstrates that a count survey would be the most cost-effective design but would entail reduction in the number of sites. If this site reduction is not desirable, a presence-absence survey would be the most cost-effective survey. PMID:16909580
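
    As a hedged illustration of the kind of power comparison described above, the sketch below simulates a regional decline and contrasts a count survey with a presence-absence survey of the same sites. The site numbers, encounter rate, decline size, and detection probability are invented for illustration and are not values from the study; varying them shows how the relative power of the two designs shifts, which is the kind of trade-off the abstract quantifies.

        # Toy power comparison: presence-absence vs. count surveys for detecting a decline.
        # All parameter values are illustrative assumptions, not taken from the study above.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        N_SITES = 100      # survey sites
        LAMBDA0 = 0.5      # mean abundance per site before the decline (a sparse species)
        DECLINE = 0.5      # 50% regional decline
        P_DETECT = 0.6     # per-individual detection probability during one visit
        N_SIMS = 1000
        ALPHA = 0.05

        def one_survey(mean_abundance):
            """Simulate one survey season: latent abundance, then imperfect counts."""
            abundance = rng.poisson(mean_abundance, N_SITES)
            counts = rng.binomial(abundance, P_DETECT)   # observed counts
            presence = (counts > 0).astype(int)          # observed presence-absence
            return counts, presence

        count_hits = pa_hits = 0
        for _ in range(N_SIMS):
            c_before, pa_before = one_survey(LAMBDA0)
            c_after, pa_after = one_survey(LAMBDA0 * (1 - DECLINE))

            # Count survey: compare per-site counts before vs. after the decline.
            _, p_count = stats.mannwhitneyu(c_before, c_after, alternative="greater")
            count_hits += p_count < ALPHA

            # Presence-absence survey: compare the proportion of occupied sites.
            table = [[pa_before.sum(), N_SITES - pa_before.sum()],
                     [pa_after.sum(), N_SITES - pa_after.sum()]]
            _, p_pa = stats.fisher_exact(table, alternative="greater")
            pa_hits += p_pa < ALPHA

        print(f"power (count survey):            {count_hits / N_SIMS:.2f}")
        print(f"power (presence-absence survey): {pa_hits / N_SIMS:.2f}")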

  2. OSCE Mission Survey

    NSDL National Science Digital Library

    Released on January 17 by the Organization for Security and Co-Operation in Europe (OSCE), this survey provides an overview of the mandates and other key information related to current OSCE field activities. Intended for practitioners involved in support of OSCE field activities as well as the interested public, the survey is designed to "facilitate reference to official OSCE documents and decisions on the subject." Users can read the survey by chapter in HTML format or in its entirety in .pdf format.

  3. Critical survey on the biomechanical criterion in the NIOSH method for the design and evaluation of manual lifting tasks

    Microsoft Academic Search

    Matthias Jäger; Alwin Luttmann

    1999-01-01

    In 1981, the National Institute for Occupational Safety and Health (NIOSH) published a comprehensive guide for the evaluation and design of manual lifting, based on epidemiological, physiological, psychophysical, and biomechanical knowledge. A revised version of the easy-to-use “NIOSH lifting equation” was provided in 1991 considering occasional new findings from literature. For assessing the load on the lumbar spine during lifting,

  4. Reflective Filters Design for Self-Filtering Narrowband Ultraviolet Imaging Experiment Wide-Field Surveys (NUVIEWS) Project

    NASA Technical Reports Server (NTRS)

    Park, Jung- Ho; Kim, Jongmin; Zukic, Muamer; Torr, Douglas G.

    1994-01-01

    We report the design of multilayer reflective filters for the self-filtering cameras of the NUVIEWS project. Wide angle self-filtering cameras were designed to image the C IV (154.9 nm) line emission, and H2 Lyman band fluorescence (centered at 161 nm) over a 20 deg x 30 deg field of view. A key element of the filter design includes the development of pi-multilayers optimized to provide maximum reflectance at 154.9 nm and 161 nm for the respective cameras without significant spectral sensitivity to the large cone angle of the incident radiation. We applied self-filtering concepts to design NUVIEWS telescope filters that are composed of three reflective mirrors and one folding mirror. The filters, with narrowband widths of 6 and 8 nm at 154.9 and 161 nm, respectively, have net throughputs of more than 50% with average blocking of out-of-band wavelengths better than 3 x 10^-4%.

  5. Web Survey Design in ASP.Net 2.0: A Simple Task with One Line of Code

    ERIC Educational Resources Information Center

    Liu, Chang

    2007-01-01

    Over the past few years, more and more companies have been investing in electronic commerce (EC) by designing and implementing Web-based applications. In the world of practice, the importance of using Web technology to reach individual customers has been presented by many researchers. This paper presents an easy way of conducting marketing…

  6. GUIDELINES FOR ZOOPLANKTON SAMPLING IN QUANTITATIVE BASELINE AND MONITORING PROGRAMS

    EPA Science Inventory

    Methods applicable to zooplankton sampling and analysis in quantitative baseline and monitoring surveys are evaluated and summarized. Specific recommendations by managers must take into account characteristics of the water mass under investigation, the abundance of contained zoop...

  7. Surveying drainage culvert use by carnivores: sampling design and cost-benefit analyzes of track-pads vs. video-surveillance methods.

    PubMed

    Mateus, Ana Rita A; Grilo, Clara; Santos-Reis, Margarida

    2011-10-01

    Environmental assessment studies often evaluate the effectiveness of drainage culverts as habitat linkages for species; however, the efficiency of the sampling designs and the survey methods is not known. Our main goal was to identify the most cost-effective monitoring method for sampling carnivore use of culverts, using track-pads and video-surveillance. We estimated the most efficient (lower costs and high detection success) interval between visits (days) when using track-pads and also determined the advantages of using each method. In 2006, we selected two highways in southern Portugal and sampled 15 culverts over two 10-day sampling periods (spring and summer). Using the track-pad method, 90% of the animal tracks were detected using a 2-day interval between visits. We recorded a higher number of crossings for most species using video-surveillance (n = 129) when compared with the track-pad technique (n = 102); however, the detection ability using the video-surveillance method varied with type of structure and species. More crossings were detected in circular culverts (1 m and 1.5 m diameter) than in box culverts (2 m to 4 m width), likely because video cameras had a reduced vision coverage area. On the other hand, carnivore species with small feet such as the common genet Genetta genetta were detected less often using the track-pad surveying method. The cost-benefit analysis shows that the track-pad technique is the most appropriate technique, but video-surveillance allows year-round surveys as well as analysis of the behavioral responses of species using crossing structures. PMID:21181260

  8. Use of Web and In-Person Survey Modes to Gather Data from Young Adults on Sex and Drug Use: An Evaluation of Cost, Time, and Survey Error Based on a Randomized Mixed-Mode Design

    ERIC Educational Resources Information Center

    McMorris, Barbara J.; Petrie, Renee S.; Catalano, Richard F.; Fleming, Charles B.; Haggerty, Kevin P.; Abbott, Robert D.

    2009-01-01

    In a randomized test of mixed-mode data collection strategies, 386 participants in the Raising Healthy Children (RHC) Project were either (a) asked to complete a survey via the Internet and later offered the opportunity to complete the survey in person or (b) first offered an in-person survey, with the Web follow-up. The Web-first condition…

  9. Measuring health literacy in populations: illuminating the design and development process of the European Health Literacy Survey Questionnaire (HLS-EU-Q)

    PubMed Central

    2013-01-01

    Background Several measurement tools have been developed to measure health literacy. The tools vary in their approach and design, but few have focused on comprehensive health literacy in populations. This paper describes the design and development of the European Health Literacy Survey Questionnaire (HLS-EU-Q), an innovative, comprehensive tool to measure health literacy in populations. Methods Based on a conceptual model and definition, the process involved item development, pre-testing, field-testing, external consultation, plain language check, and translation from English to Bulgarian, Dutch, German, Greek, Polish, and Spanish. Results The development process resulted in the HLS-EU-Q, which entailed two sections, a core health literacy section and a section on determinants and outcomes associated with health literacy. The health literacy section included 47 items addressing self-reported difficulties in accessing, understanding, appraising and applying information in tasks concerning decision-making in healthcare, disease prevention, and health promotion. The second section included items related to health behaviour, health status, health service use, community participation, socio-demographic and socio-economic factors. Conclusions By illuminating the detailed steps in the design and development process of the HLS-EU-Q, the aim is to provide a deeper understanding of its purpose, capabilities and limitations for others using the tool. The vision is that, by stimulating wide application, the HLS-EU-Q will be validated in more countries, enhancing the understanding of health literacy in different populations. PMID:24112855

  10. Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset

    USGS Publications Warehouse

    Raef, A.

    2009-01-01

    The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emergence of the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on cost-effective and robust quality control and assurance (QC/QA) workflow of 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information, data that are free of noise-dominated traces, and/or flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used for demonstrating QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.
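
    As a small, hedged illustration of one diagnostic mentioned above (screening out noise-dominated traces before subsequent processing), the sketch below flags traces whose RMS amplitude falls far outside the robust spread of a gather. The array sizes and threshold are invented, and a production workflow would read traces, headers, and geometry from the survey itself.

        # Toy pre-stack QC: flag noise-dominated traces by RMS amplitude.
        # The gather and threshold are invented stand-ins, not field data.
        import numpy as np

        rng = np.random.default_rng(0)
        n_traces, n_samples = 240, 1500
        gather = rng.normal(0.0, 1.0, (n_traces, n_samples))   # stand-in for a shot gather
        gather[17] *= 25.0                                      # inject one noisy trace

        rms = np.sqrt(np.mean(gather**2, axis=1))               # per-trace RMS amplitude
        median_rms = np.median(rms)
        mad = np.median(np.abs(rms - median_rms))               # robust spread estimate

        # Flag traces whose RMS sits far outside the robust bulk of the gather.
        flagged = np.where(np.abs(rms - median_rms) > 6.0 * mad)[0]
        print("noise-dominated trace indices:", flagged)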

  11. AIPS technology survey report

    NASA Technical Reports Server (NTRS)

    Ogletree, Glenn (editor)

    1984-01-01

    The results of a technology survey conducted for the NASA/JSC by the CSDL during Phase 1 of the NASA Advanced Information Processing System (AIPS) program at the CSDL are discussed. The purpose of the survey was to ensure that all technology relevant to the configuration, design, development, verification, implementation, and validation of an advanced information processing system, whether existing or under development and soon to be available, would be duly considered in the development of the AIPS. The emphasis in the survey was on technology items which were clearly relevant to the AIPS. Requirements were developed which guided the planning of contacts with the outside sources to be surveyed, and established practical limits on the scope and content of the Technology Survey. Subjects surveyed included architecture, software, hardware, methods for evaluation of reliability and performance, and methods for the verification of the AIPS design and the validation of the AIPS implementation. Survey requirements and survey results in each of these areas are presented, including analyses of the potential effects on the AIPS development process of using or not using the surveyed technology items. Another output of the survey was the identification of technology areas of particular relevance to the AIPS and for which further development, in some cases by the CSDL and in some cases by the NASA, would be fruitful. Appendices are provided in which are presented: (1) reports of some of the actual survey interactions with industrial and other outside information sources; (2) the literature list from the comprehensive literature survey which was conducted; (3) reduced-scale images of an excerpt ('Technology Survey' viewgraphs) from the set of viewgraphs used at the 14 April 1983 Preliminary Requirements Review by the CSDL for the NASA; and (4) reduced-scale images of the set of viewgraphs used in the AIPS Technology Survey Review presentation to the NASA monitors by the CSDL at the NASA Langley Research Center on 28 Sep. 1983.

  12. NATIONAL COMORBIDITY SURVEY (NCS)

    EPA Science Inventory

    The National Comorbidity Survey (NCS) was a collaborative epidemiologic investigation designed to study the prevalence and correlates of DSM III-R disorders and patterns and correlates of service utilization for these disorders. The NCS was the first survey to administer a struct...

  13. Three-dimensional quantitative structure-activity relationships and docking studies of some structurally diverse flavonoids and design of new aldose reductase inhibitors

    PubMed Central

    Chandra De, Utpal; Debnath, Tanusree; Sen, Debanjan; Debnath, Sudhan

    2015-01-01

    Aldose reductase (AR) plays an important role in the development of several long-term diabetic complications. Inhibition of AR activities is a strategy for controlling complications arising from chronic diabetes. Several AR inhibitors have been reported in the literature. Flavonoid-type compounds have been shown to inhibit AR significantly. The objective of this study was to perform computational work to gain structural insight into flavonoid-type compounds for developing, as well as for searching for, new flavonoid-based AR inhibitors. A data set comprising 68 flavones, along with their pIC50 values ranging from 0.44 to 4.59, was collected from the literature. The structures of all the flavonoids were drawn in ChemBioDraw Ultra 11.0, converted into corresponding three-dimensional structures, saved as mol files, and then imported to the Maestro project table. Imported ligands were prepared using the LigPrep option of Maestro 9.6. Three-dimensional quantitative structure-activity relationship and docking studies were performed with appropriate options of Maestro 9.6 installed on an HP Z820 workstation running CentOS 6.3 (Linux). A model with five partial least squares factors, standard deviation 0.2482, R2 = 0.9502, and a regression variance ratio of 122 was found to be the best statistical model. PMID:25709964
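
    For readers unfamiliar with the modelling step, the following sketch shows a partial least squares fit of the kind summarized above (five latent factors, with R2 reported on the fitted activities). The descriptor matrix and activities are random stand-ins, not the published field descriptors or pIC50 data.

        # Sketch of a PLS model with five latent factors, reporting R^2 on the fit.
        # X and y are random stand-in data, not the published descriptors or pIC50 values.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(42)
        n_compounds, n_descriptors = 68, 200
        X = rng.normal(size=(n_compounds, n_descriptors))            # stand-in descriptors
        y = X @ rng.normal(size=n_descriptors) * 0.05 \
            + rng.normal(scale=0.3, size=n_compounds)                # stand-in activities

        pls = PLSRegression(n_components=5)
        pls.fit(X, y)
        y_pred = pls.predict(X).ravel()
        print(f"R^2 of the fitted 5-factor PLS model: {r2_score(y, y_pred):.3f}")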

  14. An effective dietary survey framework for the assessment of total dietary arsenic intake in Bangladesh: part-A--FFQ design.

    PubMed

    Khan, Nasreen Islam; Owens, Gary; Bruce, David; Naidu, Ravi

    2009-04-01

    The accurate assessment of dietary intake patterns is important for the determination of total dietary arsenic (As) exposure in As-contaminated regions of Bangladesh. Food intake questionnaires are a common means of assessing food intake. A food frequency questionnaire (FFQ) was designed to assess the daily intake of frequently consumed food items and was successfully implemented to assess dietary patterns and intake of the rural populations in 18 villages from three Districts of Bangladesh (Comilla, Manikganj Sadar, and Munshiganj). The FFQ presented in this paper comprises a complete set of tools which allowed not only collection of information on dietary patterns but also information on the spatial characteristics of the landscape, socio-demographic indicators, and geographic locations of the identified environmental media of the contaminants, which resulted in As exposure to humans. The FFQ was designed in three sections: (1) general household information, (2) household water and rice information, and (3) individual dietary intake of other foods. The dietary intake of other food was then further subdivided into five different food subgroups: (i) grain intake, (ii) protein intake, (iii) fruit intake, (iv) vegetable intake, and (v) dal (pulse) intake. During the design and development of the FFQ, emphasis was given to the source of food, the frequency (day/week/month) of consumption, and the daily amount of food consumed by each adult male, adult female, and child to accurately determine the dietary pattern and intake of arsenic in the rural population of Bangladesh. PMID:19172402

  15. The Personal Health Survey

    ERIC Educational Resources Information Center

    Thorne, Frederick C.

    1978-01-01

    The Personal Health Survey (PHS) is a 200-item inventory designed to sample symptomatology as subjective experiences from the 12 principal domains of organ system and psychophysiological functioning. This study investigates the factorial validity of the empirically constructed scales. (Author)

  16. Flat conductor cable survey

    NASA Technical Reports Server (NTRS)

    Swanson, C. R.; Walker, G. L.

    1973-01-01

    Design handbook contains data and illustrations concerned with commercial and Government flat-conductor-cable connecting and terminating hardware. Material was obtained from a NASA-sponsored industry-wide survey of approximately 150 companies and Government agencies.

  17. The path of placement of a removable partial denture: a microscope based approach to survey and design

    PubMed Central

    2015-01-01

    This article reviews the topic of how to identify and develop a removable partial denture (RPD) path of placement, and provides a literature review of the concept of the RPD path of placement, also known as the path of insertion. An optimal RPD path of placement, guided by mutually parallel guide planes, ensures that the RPD flanges fit intimately over edentulous ridge structures and that the framework fits intimately with guide plane surfaces, which prevents food collecting empty spaces between the intaglio surface of the framework and intraoral surfaces, and ensures that RPD clasps engage adequate numbers of tooth undercuts to ensure RPD retention. The article covers topics such as the causes of obstructions to RPD intra-oral seating, the causes of food collecting empty spaces that may exist around an RPD, and how to identify if a guide plane is parallel with the projected RPD path of placement. The article presents a method of using a surgical operating microscope, or high magnification (6-8x or greater) binocular surgical loupes telescopes, combined with co-axial illumination, to identify a preliminary path of placement for an arch. This preliminary path of placement concept may help to guide a dentist or a dental laboratory technician when surveying a master cast of the arch to develop an RPD path of placement, or in verifying that intra-oral contouring has aligned teeth surfaces optimally with the RPD path of placement. In dentistry, a well-fitting RPD reduces long-term periodontal or structural damage to abutment teeth. PMID:25722842

  18. The path of placement of a removable partial denture: a microscope based approach to survey and design.

    PubMed

    Mamoun, John Sami

    2015-02-01

    This article reviews the topic of how to identify and develop a removable partial denture (RPD) path of placement, and provides a literature review of the concept of the RPD path of placement, also known as the path of insertion. An optimal RPD path of placement, guided by mutually parallel guide planes, ensures that the RPD flanges fit intimately over edentulous ridge structures and that the framework fits intimately with guide plane surfaces, which prevents food collecting empty spaces between the intaglio surface of the framework and intraoral surfaces, and ensures that RPD clasps engage adequate numbers of tooth undercuts to ensure RPD retention. The article covers topics such as the causes of obstructions to RPD intra-oral seating, the causes of food collecting empty spaces that may exist around an RPD, and how to identify if a guide plane is parallel with the projected RPD path of placement. The article presents a method of using a surgical operating microscope, or high magnification (6-8x or greater) binocular surgical loupes telescopes, combined with co-axial illumination, to identify a preliminary path of placement for an arch. This preliminary path of placement concept may help to guide a dentist or a dental laboratory technician when surveying a master cast of the arch to develop an RPD path of placement, or in verifying that intra-oral contouring has aligned teeth surfaces optimally with the RPD path of placement. In dentistry, a well-fitting RPD reduces long-term periodontal or structural damage to abutment teeth. PMID:25722842

  19. DRAFT - Design of Radiological Survey and Sampling to Support Title Transfer or Lease of Property on the Department of Energy Oak Ridge Reservation

    SciTech Connect

    Cusick L.T.

    2002-09-25

    The U.S. Department of Energy (DOE) owns, operates, and manages the buildings and land areas on the Oak Ridge Reservation (ORR) in Oak Ridge, Tennessee. As land and buildings are declared excess or underutilized, it is the intent of DOE to either transfer the title of or lease suitable property to the Community Reuse Organization of East Tennessee (CROET) or other entities for public use. It is DOE's responsibility, in coordination with the U.S. Environmental Protection Agency (EPA), Region 4, and the Tennessee Department of Environment and Conservation (TDEC), to ensure that the land, facilities, and personal property that are to have the title transferred or are to be leased are suitable for public use. Release of personal property must also meet site requirements and be approved by the DOE contractor responsible for site radiological control. The terms title transfer and lease in this document have unique meanings. Title transfer will result in release of ownership without any restriction or further control by DOE. Under lease conditions, the government retains ownership of the property along with the responsibility to oversee property utilization. This includes involvement in the lessee's health, safety, and radiological control plans and conduct of site inspections. It may also entail lease restrictions, such as limiting access to certain areas or prohibiting digging, drilling, or disturbing material under surface coatings. Survey and sampling requirements are generally more rigorous for title transfer than for lease. Because of the accelerated cleanup process, there is an increasing emphasis on title transfers of facilities and land. The purpose of this document is to describe the radiological survey and sampling protocols that are being used for assessing the radiological conditions and characteristics of building and land areas on the Oak Ridge Reservation that contain space potentially available for title transfer or lease. After necessary surveys and sampling and laboratory analyses are completed, the data are analyzed and included in an Environmental Baseline Summary (EBS) report for title transfer or in a Baseline Environmental Analysis Report (BEAR) for lease. The data from the BEAR are then used in a Screening-Level Human Health Risk Assessment (SHHRA) or a risk calculation (RC) to assess the potential risks to future owners/occupants. If title is to be transferred, release criteria in the form of specific activity concentrations called Derived Concentration Guideline Levels (DCGLs) will be developed for each property. The DCGLs are based on the risk model and are used with the data in the EBS to determine, with statistical confidence, that the release criteria for the property have been met. The goal of the survey and sampling efforts is to (1) document the baseline conditions of the property (real or personal) prior to title transfer or lease, (2) obtain enough information that an evaluation of radiological risks can be made, and (3) collect sufficient data so that areas that contain minimal residual levels of radioactivity can be identified and, following radiological control procedures, be released from radiological control. (It should be noted that release from radiological control does not necessarily mean free release because DOE may maintain institutional control of the site after it is released from radiological control).
To meet the goals of this document, a Data Quality Objective (DQO) process will be used to enhance data collection efficiency and assist with decision-making. The steps of the DQO process involve stating the problem, identifying the decision, identifying inputs to the decision, developing study boundaries, developing the decision rule, and optimizing the design. This document describes the DQOs chosen for surveys and sampling efforts performed for the purposes listed above. The previous version of this document focused on the requirements for radiological survey and sampling protocols that are to be used for leasing. Because the primary focus at this time is on title transfer, th

  20. The IRAC Shallow Survey

    E-print Network

    P. R. Eisenhardt; D. Stern; M. Brodwin; G. G. Fazio; G. H. Rieke; M. J. Rieke; M. W. Werner; E. L. Wright; L. E. Allen; R. G. Arendt; M. L. N. Ashby; P. Barmby; W. J. Forrest; J. L. Hora; J. -S. Huang; J. Huchra; M. A. Pahre; J. L. Pipher; W. T. Reach; H. A. Smith; J. R. Stauffer; Z. Wang; S. P. Willner; M. J. I. Brown; A. Dey; B. T. Jannuzi; G. P. Tiede

    2004-06-09

    The IRAC shallow survey covers 8.5 square degrees in the NOAO Deep Wide-Field Survey in Bootes with 3 or more 30 second exposures per position. An overview of the survey design, reduction, calibration, star-galaxy separation, and initial results is provided. The survey includes approximately 370,000, 280,000, 38,000, and 34,000 sources brighter than the 5 sigma limits of 6.4, 8.8, 51, and 50 microJy at 3.6, 4.5, 5.8, and 8 microns respectively, including some with unusual spectral energy distributions.

  1. Conduct a state-of-the-art survey of existing knowledge for the design of ground-source heat pumps

    NASA Astrophysics Data System (ADS)

    Ball, D. A.

    1982-03-01

    Horizontal serpentine coils have been and are at present the most common coil configuration. Best design data exist for horizontal coils in heating-only applications with moist soil. Applications in dry soil or where significant summer cooling is required are not as well understood at this time. A seasonal performance factor of about 3.0 can be expected for a properly designed and installed residential ground-coupled heat-pump system. Long-term durability of buried steel and copper tubing has been demonstrated. Life expectancy of thin-walled polyethylene tubing in the heating-only application is expected to be equally good; however, present experience is limited to less than five years. In the cooling application with heat-rejection temperatures exceeding 100 F, some cracking has been experienced upon subsequent cool-down for heating operation due to localized stresses induced by conformity of the tubing to bedding material (stones) when hot. Receding of the soil from the pipe after a period of several years was experienced in the late 1940s. An understanding of this phenomenon may be crucial to the long-term operating success of these systems.

  2. Biological Survey of the Upper Purgatoire Watershed

    E-print Network

    Biological Survey of the Upper Purgatoire Watershed, Las Animas County, CO. John Carney, Colorado… Designate Target Inventory Areas (TIAs…

  3. The Dark Energy Survey: Survey Strategy

    NASA Astrophysics Data System (ADS)

    Annis, James T.; Cunha, C.; Busha, M.; Ma, Z.; DES Collaboration

    2011-01-01

    The Dark Energy Survey uses 525 nights over 5 years of time on the CTIO Blanco 4m telescope to image 5000 sq-degrees of the South Galactic Cap in 5 filters while also performing a roughly 15 sq-degree time domain survey for supernovae. The survey strategy is designed to optimize three things: our ability to do cluster and LSS science early in the survey, our ability to do weak lensing, and our ability to collect a cosmologically interesting sample of supernovae. Thus we cover the entire survey area twice per year per bandpass (grizY, i > 24); we devote the best seeing time to the main survey; and we observe the SN fields at high cadence over 5-6 months while minimizing observation gaps of a week or more. We have chosen survey metrics which report survey area covered given a fiducial exposure time (the equivalent number of tilings), and an estimate of the total number of galaxies useful for weak lensing that is essentially a non-linear combination of signal to noise and seeing (the effective number of galaxies, n_eff). We have developed photo-z simulations given survey strategy parameters, and these along with the area covered, the depth achieved and n_eff allow us to estimate the figure of merit for our LSS, clusters, and weak lensing projects to judge the ability of each scenario to maximize the DETF FOM. We have developed a tool that uses the extensive weather and seeing data available for CTIO to simulate the survey. The poster will describe the current survey strategy and the results that support each choice.

  4. Development and validation of LC-MS/MS method for the quantitation of lenalidomide in human plasma using Box-Behnken experimental design.

    PubMed

    Hasnain, M Saquib; Rao, Shireen; Singh, Manoj Kr; Vig, Nitin; Gupta, Amit; Ansari, Abdulla; Sen, Pradeep; Joshi, Pankaj; Ansari, Shaukat Ali

    2013-03-01

    For the determination of lenalidomide using carbamazepine as an internal standard, an ultra-fast stability-indicating LC-MS/MS method was developed, validated and optimized to support clinical advancement. The samples were prepared by solid-phase extraction. The calibration range was 2-1000 ng/mL, for which a quadratic regression (1/x^2 weighting) was best fitted. The method was validated, and a 3^2 factorial Box-Behnken experimental design was employed for the validation of robustness. The design had three factors, mobile phase composition (X1), flow rate (X2) and pH (X3), while peak area (Y1) and retention time (Y2) were taken as responses. This showed that small changes in mobile phase composition and flow rate affect the response, while pH has no effect. Lenalidomide and carbamazepine were stable in human plasma after five freeze-thaw cycles, at room temperature for 23.7 h, and on the bench top for 6.4 h. This method complies with all the regulatory requirements for selectivity, sensitivity, precision, accuracy, and stability for the determination of lenalidomide in human plasma, and is highly sensitive and effective for pharmacokinetic and bioequivalence studies of lenalidomide. PMID:23323263
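
    The calibration step described above can be illustrated with a short, hedged sketch: a quadratic curve fitted to standards with 1/x^2 weighting and then inverted to back-calculate an unknown concentration. The standard concentrations and peak-area ratios below are invented example numbers, not data from the published assay.

        # Sketch of a 1/x^2-weighted quadratic calibration and back-calculation.
        # Standard concentrations and peak-area ratios are invented example numbers.
        import numpy as np

        conc = np.array([2, 5, 20, 50, 100, 250, 500, 1000], dtype=float)   # ng/mL
        response = np.array([0.011, 0.027, 0.108, 0.268, 0.531, 1.30, 2.55, 4.95])

        # Quadratic fit response = a*conc^2 + b*conc + c with 1/x^2 weighting.
        # np.polyfit squares the supplied weights internally, so pass sqrt(1/x^2) = 1/x.
        a, b, c = np.polyfit(conc, response, deg=2, w=1.0 / conc)

        def back_calculate(r):
            """Invert the calibration curve to get a concentration from a response."""
            roots = np.roots([a, b, c - r])
            real = roots[np.isreal(roots)].real
            return real[real > 0].min()        # the physically meaningful root

        print("back-calculated concentration for response 0.50:",
              round(back_calculate(0.50), 1), "ng/mL")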

  5. NATIONAL MORTALITY FOLLOWBACK SURVEY (NMFS)

    EPA Science Inventory

    The 1993 National Mortality Followback Survey (NMFS) is the latest in a series of periodic surveys designed to supplement information routinely collected on the death certificate. The Mortality Followback Survey Program, begun in the 1960's by the National Center for Health Stati...

  6. New Student Survey, Fall 1998.

    ERIC Educational Resources Information Center

    Weglarz, Shirley

    The Fall 1998 annual survey of new Johnson County Community College (JCCC) students was designed to determine new students' educational objectives and what factors influence new students' decisions to attend JCCC. Surveys mailed to 3874 students identified by the Admissions Office resulted in 713 usable returned surveys. This evaluation reports…

  7. HPLC and HPTLC methods by design for quantitative characterization and in vitro anti-oxidant activity of polyherbal formulation containing Rheum emodi.

    PubMed

    Ahmad, Wasim; Zaidi, Syed Mohammad Arif; Mujeeb, Mohd; Ansari, Shahid Hussain; Ahmad, Sayeed

    2014-09-01

    Safoof-e-Pathar phori (SPP) is a traditional polyherbal formulation that has long been used for its anti-urolithiatic activity. It contains three plant constituents: Didymocarpous pedicellata, Dolichous biflorus and Rheum emodi. Emodin and chrysophanic acid were selected as chemical markers for SPP and quantified in R. emodi and in SPP using simultaneous HPTLC and RP-HPLC methods. The simultaneous methods were found to be linear (r^2 = 0.991) over a wide range (10-800 ng/spot with HPTLC, 5-500 µg/mL with HPLC), and precise, accurate and robust for both drugs. The anti-oxidant activities of SPP and R. emodi, as well as of standard emodin and chrysophanic acid, were determined using DPPH (2,2-diphenyl-1-picrylhydrazyl radical); the R. emodi extract (IC50 = 12.27) showed better activity than SPP (IC50 = 32.99) and the standard drugs (IC50 = 66.81). The robustness of the methods was demonstrated by applying a Box-Behnken response surface design, the other validation parameters evaluated were satisfactorily met, and the developed methods were therefore found suitable for application in the quality control of formulations containing emodin and chrysophanic acid. PMID:23978770

  8. Design.

    ERIC Educational Resources Information Center

    Online-Offline, 1998

    1998-01-01

    Provides an annotated bibliography of resources on this month's theme "Design" for K-8 language arts, art and architecture, music and dance, science, math, social studies, health, and physical education. Includes Web sites, CD-ROMs and software, videos, books, audiotapes, magazines, professional resources and classroom activities. Features Art…

  9. Design for a multifrequency high magnetic field superconducting quantum interference device-detected quantitative electron paramagnetic resonance probe: Spin-lattice relaxation of cupric sulfate pentahydrate (CuSO4·5H2O)

    NASA Astrophysics Data System (ADS)

    Cage, Brant; Russek, Stephen

    2004-11-01

    We have designed a spectrometer for the quantitative determination of electron paramagnetic resonance (EPR) at high magnetic fields and frequencies. It uses a superconducting quantum interference device (SQUID) for measuring the magnetic moment as a function of the applied magnetic field and microwave frequency. We used powdered 2,2-diphenyl-1-picrylhydrazyl to demonstrate resolution of g-tensor anisotropy to 1 mT in a magnetic field of 3 T with a sensitivity of 10^14 spins per 0.1 mT. We demonstrate multifrequency operation at 95 and 141 GHz. By use of an aligned single crystal of cupric sulfate pentahydrate (chalcanthite) CuSO4·5H2O, we show that the spectrometer is capable of EPR line shape analysis from 4 to 200 K with a satisfactory fit to a Lorentzian line shape at 100 K. Below 100 K, we observed line-broadening, g shifts, and spectral splittings, all consistent with a known low-dimensional phase transition. Using SQUID magnetometry and a superconducting magnet, we improve by an order of magnitude the sensitivity and magnetic field range of earlier power saturation studies of CuSO4·5H2O. We were able to saturate up to 70% of the magnetic moment with power transfer saturation studies at 95 GHz, 3.3 T, and 4 K and obtained the spin-lattice relaxation time, T1 = 1.8 ms, of CuSO4·5H2O at 3.3 T and 4 K. We found an inverse linear dependence of T1, in units of seconds (s), at 3.3 T between 4 and 2.3 K, such that T1 = (0.016 K·s)/θ - 0.0022 s, where θ is the absolute bath temperature. The quantitative determination of EPR is difficult with standard EPR techniques, especially at high frequencies or fields. Therefore this technique is of considerable value.
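
    The quoted temperature dependence can be checked with a short fit of T1 against inverse temperature. In the sketch below the (θ, T1) points are synthetic values generated from the quoted relation plus noise, purely to illustrate how the two coefficients are recovered; they are not measured data.

        # Fit of T1 against inverse temperature, using synthetic points generated from
        # the quoted relation T1 = (0.016 K*s)/theta - 0.0022 s plus noise.
        import numpy as np

        rng = np.random.default_rng(7)
        theta = np.linspace(2.3, 4.0, 8)                     # bath temperature, K
        t1_obs = 0.016 / theta - 0.0022 \
                 + rng.normal(scale=1e-4, size=theta.size)   # seconds, synthetic

        a, b = np.polyfit(1.0 / theta, t1_obs, deg=1)        # linear in 1/theta
        print(f"T1 = ({a:.4f} K*s)/theta + ({b:.4f} s)")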

  10. Hypertext: An Introduction and Survey

    Microsoft Academic Search

    Jeff Conklin

    1987-01-01

    This article is a survey of existing hypertext systems, their applications, and their design. It is both an introduction to the world of hypertext and, at a deeper cut, a survey of some of the most important design issues that go into fashioning a hypertext environment. The concept of hypertext is quite simple: Windows on the screen are associated with

  11. Development and validation of a real-time two-step RT-qPCR TaqMan(®) assay for quantitation of Sacbrood virus (SBV) and its application to a field survey of symptomatic honey bee colonies.

    PubMed

    Blanchard, Philippe; Guillot, Sylvain; Antùnez, Karina; Köglberger, Hemma; Kryger, Per; de Miranda, Joachim R; Franco, Stéphanie; Chauzat, Marie-Pierre; Thiéry, Richard; Ribière, Magali

    2014-03-01

    Sacbrood virus (SBV) is the causal agent of a disease of honey bee larvae, resulting in failure to pupate and causing death. The typical clinical symptom of SBV is an accumulation of SBV-rich fluid in swollen sub-cuticular pouches, forming the characteristic fluid-filled sac that gives its name to the disease. Outbreaks of the disease have been reported in different countries, affecting the development of the brood and causing losses in honey bee colonies. Today, few data are available on the SBV viral load in the case of overt disease in larvae, or for the behavioural changes of SBV-infected adult bees. A two-step real-time RT-PCR assay, based on TaqMan(®) technology using a fluorescent probe (FAM-TAMRA), was therefore developed to quantify Sacbrood virus in larvae, pupae and adult bees from symptomatic apiaries. This assay was first validated according to the recent XP-U47-600 standard issued by the French Standards Institute, where the reliability and the repeatability of the results and the performance of the assay were confirmed. The performance of the qPCR assay was validated over the 6 log range of the standard curve (i.e. from 10^2 to 10^8 copies per well) with a measurement uncertainty evaluated at 0.11 log10. The detection and quantitation limits were established at 50 copies and 100 copies of SBV genome, respectively, for a template volume of 5 µl of cDNA. The RT-qPCR assay was applied during a French SBV outbreak in 2012 where larvae with typical SBV signs were collected, along with individuals without clinical signs. The SBV quantitation revealed that, in symptomatic larvae, the virus load was significantly higher than in samples without clinical signs. Combining quantitation with clinical data, a threshold of SBV viral load related to overt disease was proposed (10^10 SBV genome copies per individual). PMID:24121133
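
    As a hedged illustration of how a standard curve spanning 10^2 to 10^8 copies per well translates quantification-cycle values into copy numbers, the sketch below fits a log-linear standard curve, derives the amplification efficiency, and back-calculates an unknown. The Cq values are invented and are not data from the SBV assay.

        # Absolute quantitation from a qPCR standard curve (10^2 to 10^8 copies per well).
        # The Cq values are invented for illustration; they are not data from the SBV assay.
        import numpy as np

        std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6, 1e7, 1e8])
        std_cq = np.array([33.1, 29.8, 26.4, 23.1, 19.7, 16.4, 13.0])   # invented Cq values

        # Standard curve: Cq = slope * log10(copies) + intercept
        slope, intercept = np.polyfit(np.log10(std_copies), std_cq, deg=1)
        efficiency = 10 ** (-1.0 / slope) - 1.0   # close to 1.0 for a ~100% efficient PCR

        def copies_from_cq(cq):
            """Back-calculate copies per well for an unknown sample from its Cq."""
            return 10 ** ((cq - intercept) / slope)

        print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
        print(f"sample at Cq 21.5 is roughly {copies_from_cq(21.5):.2e} copies per well")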

  12. Quantitative Review of a Political Science Documentary/Movie

    NSDL National Science Digital Library

    Tun Myint

    This assignment is designed to introduce quantitative reasoning and critical thinking in viewing documentary videos on the issues of development. Students will write a review essay about one of three designated documentaries for the course.

  13. Toward an Effective Design Process: Enhancing Building Performance through Better Integration of Facility Management Perspectives in the Design Process

    E-print Network

    Kalantari Hematabadi, Seyed Saleh

    2014-08-20

    and energy-use patterns anticipated by the building’s designers. The current research took a slightly different approach to this topic, by evaluating the outlooks and practices of facility managers (rather than occupants). In doing so, it helped … In the quantitative material, survey results are presented and interpreted. Chapter VI provides a discussion of the research findings, and compares these results against outlooks given in the previous literature. Finally, Chapter VII is a conclusion that highlights...

  14. Montana State University Land Surveying Minor

    E-print Network

    Maxwell, Bruce D.

    This minor is designed to provide students with the perspective and skills to pursue a successful career in surveying or a surveying-related field. The focus is on courses related to surveying such as photogrammetry, global positioning systems

  15. Terminating Sequential Delphi Survey Data Collection

    ERIC Educational Resources Information Center

    Kalaian, Sema A.; Kasim, Rafa M.

    2012-01-01

    The Delphi survey technique is an iterative mail or electronic (e-mail or web-based) survey method used to obtain agreement or consensus among a group of experts in a specific field on a particular issue through a well-designed and systematic multiple sequential rounds of survey administrations. Each of the multiple rounds of the Delphi survey

  16. "Suntelligence" Survey

    MedlinePLUS

    ... the American Academy of Dermatology's "Suntelligence" sun-smart survey. Please answer the following questions to measure your ... how you incorporate it into your life. The survey will take 5 to 7 minutes to complete. ...

  17. Design, synthesis, and quantitative structure-activity relationship study of herbicidal analogues of pyrazolo[5,1-d][1,2,3,5]tetrazin-4(3H)ones.

    PubMed

    Zhu, You-quan; Wu, Chao; Li, Hua-bin; Zou, Xiao-mao; Si, Xue-kai; Hu, Fang-zhong; Yang, Hua-zheng

    2007-02-21

    A series of pyrazolo[5,1-d][1,2,3,5]tetrazin-4(3H)one derivatives were designed, synthesized, and evaluated for their herbicidal activities where some of these compounds provided >80% control of Brassica campestris at 10 microg/mL. Quantitative structure-activity relationship studies were performed on these compounds using physicochemical parameters (electronic, Verloop, or hydrophobic) as independent parameters and herbicidal activity as a dependent parameter, where herbicidal activity correlated best (r > 0.8) with physicochemical parameters in this set of molecules. The herbicidal activity against B. campestris was mainly affected by the molar refractivity (MR) for R1, Taft (Eso) for R2 or R6, Verloop (Lm) for R3 or R5, and electronic parameters (Hammett's constants) for R4. The optimal MR for herbicidal activity is 0.95. The herbicidal activity against Echinochloa crus-galli was mainly related with the substituents' hydrophobic parameter. The optimal pi parameters for R1 and R4 for herbicidal activity are 0.72 and 0.68, respectively. In general, these compounds showed greater herbicidal activity toward B. campestris than E. crus-galli. PMID:17300154

  18. Online Interactive Teaching Modules Enhance Quantitative Proficiency of Introductory Biology Students

    PubMed Central

    Nelson, Kären C.; Marbach-Ad, Gili; Keller, Michael; Fagan, William F.

    2010-01-01

    There is widespread agreement within the scientific and education communities that undergraduate biology curricula fall short in providing students with the quantitative and interdisciplinary problem-solving skills they need to obtain a deep understanding of biological phenomena and be prepared fully to contribute to future scientific inquiry. MathBench Biology Modules were designed to address these needs through a series of interactive, Web-based modules that can be used to supplement existing course content across the biological sciences curriculum. The effect of the modules was assessed in an introductory biology course at the University of Maryland. Over the course of the semester, students showed significant increases in quantitative skills that were independent of previous math course work. Students also showed increased comfort with solving quantitative problems, whether or not they ultimately arrived at the correct answer. A survey of spring 2009 graduates indicated that those who had experienced MathBench in their course work had a greater appreciation for the role of mathematics in modern biology than those who had not used MathBench. MathBench modules allow students from diverse educational backgrounds to hone their quantitative skills, preparing them for more complex mathematical approaches in upper-division courses. PMID:20810959

  19. [Real time quantitative PCR].

    PubMed

    Kim, D W

    2001-04-21

    So far, quantitative techniques such as PCR and FISH have been used to detect DNA and RNA. However, it is difficult to measure and compare the exact amount of amplified product from the endpoint analysis used in conventional PCR techniques. Theoretically, there is a quantitative relationship between the amount of starting target sequence and the amount of PCR product at any given cycle. The development of real-time quantitative PCR (RQ-PCR) has eliminated the variability associated with conventional quantitative PCR, thus allowing the routine and reliable quantitation of PCR products. Detection of fluorescence during the thermal cycling process can be performed using the iCycler (Bio-Rad), the GeneAmp 5700 or 7700 (ABI-PRISM), and the LightCycler (Roche). Two fluorogenic chemistries are available for real-time quantitation. The fluorogenic 5'-nuclease assay (TaqMan method) uses a fluorogenic probe to enable the detection of a sequence-specific PCR product; the probe carries a reporter dye on the 5' end and a quencher on the 3' end. The second method uses SYBR Green I dye, a highly specific double-stranded DNA binding dye. Real-time PCR makes quantitation of DNA and RNA much more precise and reproducible because it is based on CT values acquired during the exponential phase of PCR rather than at the endpoint. In this review, a detailed protocol for the real-time quantitative PCR technique is introduced, and our recently developed system for exact quantitation of the BCR-ABL fusion gene in CML is described. PMID:11708318
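
    A minimal sketch of CT-based relative quantification, assuming the common 2^(-ddCt) scheme and roughly 100% amplification efficiency for both target and reference; the review does not prescribe this exact calculation, and the example CT values below are made up for illustration (e.g. a BCR-ABL target normalized to a reference gene).

        # Relative quantification from real-time PCR CT values using the 2^(-ddCt)
        # method. Shown only to illustrate why CT values taken from the exponential
        # phase support quantitation; not necessarily the exact pipeline of the review.

        def relative_expression(ct_target_sample, ct_ref_sample,
                                ct_target_calibrator, ct_ref_calibrator):
            """Fold change of the target vs a reference gene, relative to a calibrator."""
            d_ct_sample = ct_target_sample - ct_ref_sample            # normalize to reference
            d_ct_calibrator = ct_target_calibrator - ct_ref_calibrator
            dd_ct = d_ct_sample - d_ct_calibrator                     # normalize to calibrator
            return 2.0 ** (-dd_ct)

        # Example with made-up CT values: roughly 2^3.2, i.e. ~9-fold higher expression.
        print(relative_expression(24.1, 20.0, 27.5, 20.2))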

  20. Accounting for Imperfect Detection in Ecology: A Quantitative Review

    PubMed Central

    Kellner, Kenneth F.; Swihart, Robert K.

    2014-01-01

    Detection in studies of species abundance and distribution is often imperfect. Assuming perfect detection introduces bias into estimation that can weaken inference upon which understanding and policy are based. Despite availability of numerous methods designed to address this assumption, many refereed papers in ecology fail to account for non-detection error. We conducted a quantitative literature review of 537 ecological articles to measure the degree to which studies of different taxa, at various scales, and over time have accounted for imperfect detection. Overall, just 23% of articles accounted for imperfect detection. The probability that an article incorporated imperfect detection increased with time and varied among taxa studied; studies of vertebrates were more likely to incorporate imperfect detection. Among articles that reported detection probability, 70% contained per-survey estimates of detection that were less than 0.5. For articles in which constancy of detection was tested, 86% reported significant variation. We hope that our findings prompt more ecologists to consider carefully the detection process when designing studies and analyzing results, especially for sub-disciplines where incorporation of imperfect detection in study design and analysis so far has been lacking. PMID:25356904
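
    The arithmetic behind the concern about per-survey detection below 0.5 can be illustrated with the probability of at least one detection across repeated, independent surveys with constant detection probability. This is a simplification of the occupancy-style models the review discusses, shown only to make the bias intuitive.

        # Probability of detecting a truly present species at least once in n
        # independent surveys with constant per-survey detection probability p.
        # A deliberately simple illustration; formal occupancy models handle this
        # estimation problem properly.

        def cumulative_detection(p: float, n_surveys: int) -> float:
            return 1.0 - (1.0 - p) ** n_surveys

        for p in (0.2, 0.5, 0.8):
            print(p, [round(cumulative_detection(p, n), 2) for n in (1, 2, 4, 8)])
        # With p = 0.2, even 8 surveys detect a present species only ~83% of the time.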

  1. Towards global benchmarking of food environments and policies to reduce obesity and diet-related non-communicable diseases: design and methods for nation-wide surveys

    PubMed Central

    Vandevijvere, Stefanie; Swinburn, Boyd

    2014-01-01

    Introduction Unhealthy diets are heavily driven by unhealthy food environments. The International Network for Food and Obesity/non-communicable diseases (NCDs) Research, Monitoring and Action Support (INFORMAS) has been established to reduce obesity, NCDs and their related inequalities globally. This paper describes the design and methods of the first-ever, comprehensive national survey on the healthiness of food environments and the public and private sector policies influencing them, as a first step towards global monitoring of food environments and policies. Methods and analysis A package of 11 substudies has been identified: (1) food composition, labelling and promotion on food packages; (2) food prices, shelf space and placement of foods in different outlets (mainly supermarkets); (3) food provision in schools/early childhood education (ECE) services and outdoor food promotion around schools/ECE services; (4) density of and proximity to food outlets in communities; food promotion to children via (5) television, (6) magazines, (7) sport club sponsorships, and (8) internet and social media; (9) analysis of the impact of trade and investment agreements on food environments; (10) government policies and actions; and (11) private sector actions and practices. For the substudies on food prices, provision, promotion and retail, ‘environmental equity’ indicators have been developed to check progress towards reducing diet-related health inequalities. Indicators for these modules will be assessed by tertiles of area deprivation index or school deciles. International ‘best practice benchmarks’ will be identified, against which to compare progress of countries on improving the healthiness of their food environments and policies. Dissemination This research is highly original due to the very ‘upstream’ approach being taken and its direct policy relevance. The detailed protocols will be offered to and adapted for countries of varying size and income in order to establish INFORMAS globally as a new monitoring initiative to reduce obesity and diet-related NCDs. PMID:24833697

  2. Targeted Quantitation of Proteins by Mass Spectrometry

    PubMed Central

    2013-01-01

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332

  3. Automated Quantitative Software Verification

    E-print Network

    Oxford, University of

    quantitative properties such as "the worst-case probability that the airbag fails to deploy within 10ms", instead of qualitative properties such as "the airbag eventually deploys". Although many model checking

  4. On Quantitative Rorschach Scales.

    ERIC Educational Resources Information Center

    Haggard, Ernest A.

    1978-01-01

    Two types of quantitative Rorschach scales are discussed: first, those based on the response categories of content, location, and the determinants, and second, global scales based on the subject's responses to all ten stimulus cards. (Author/JKS)

  5. Quantitative PCR Protocol

    NSDL National Science Digital Library

    The Jackson Laboratory (The Jackson Laboratory)

    2012-01-06

    This protocol describes how to genotype mice using Quantitative Polymerase Chain Reaction (PCR). The protocol focuses specifically on Ts65Dn mice, but can be used as a basis for genotyping other strains.

  6. HEDGEROW SURVEY, GREAT CRESTED NEWT SURVEY, DORMOUSE SURVEY AND HORSESHOE BAT ACTIVITY SURVEYS AT UNIVERSITY OF

    E-print Network

    Burton, Geoffrey R.

    HEDGEROW SURVEY, GREAT CRESTED NEWT SURVEY, DORMOUSE SURVEY AND HORSESHOE BAT ACTIVITY SURVEYS (UNIBAT-1624) ... to undertake a hedgerow survey, a great crested newt survey, a dormouse survey and horseshoe bat activity

  7. Quantitative receptor autoradiography

    SciTech Connect

    Boast, C.A.; Snowhill, E.W.; Altar, C.A.

    1986-01-01

    Quantitative receptor autoradiography addresses the topic of technical and scientific advances in the sphere of quantitative autoradiography. The volume opens with an overview of the field from a historical and critical perspective. Following is a detailed discussion of in vitro data obtained from a variety of neurotransmitter systems. The next section explores applications of autoradiography, and the final two chapters consider experimental models. Methodological considerations are emphasized, including the use of computers for image analysis.

  8. Use of Web and Phone Survey Modes to Gather Data From Adults About Their Young Adult Children: An Evaluation Based on a Randomized Design

    PubMed Central

    Fleming, Charles B.; Marchesini, Gina; Elgin, Jenna; Haggerty, Kevin P.; Woodward, Danielle; Abbott, Robert D.; Catalano, Richard F.

    2013-01-01

    Mode effects on responses to survey items may introduce bias to data collected using multiple modes of administration. The present study examines data from 704 surveys conducted as part of a longitudinal study in which parents and their children had been surveyed at multiple prior time points. Parents of 22-year-old study participants were randomly assigned to one of two mixed-mode conditions: (a) Web mode first followed by the offer of an interviewer-administered telephone mode; or (b) telephone mode first followed by the offer of the Web mode. Comparison of responses by assigned condition on 12 measures showed one statistically significant difference. Analyses that modeled differences by completed mode and the interaction between assigned condition and completed mode found significant differences on six measures related to completed mode. None of the differences indicated that more socially desirable responses were given in interviewer-administered surveys. PMID:24733977
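
    A sketch of the kind of analysis the abstract describes, with an outcome regressed on assigned condition, completed mode and their interaction, using statsmodels' formula interface. The column names and the synthetic placeholder data are assumptions for illustration only, not the study's variables.

        # Mode-effects model: outcome ~ assigned condition * completed mode.
        # The data frame below is random placeholder data, not study data.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 200
        df = pd.DataFrame({
            "assigned": rng.choice(["web_first", "phone_first"], size=n),  # randomized condition
            "completed": rng.choice(["web", "phone"], size=n),             # mode actually used
            "outcome": rng.normal(size=n),                                 # placeholder survey measure
        })

        model = smf.ols("outcome ~ C(assigned) * C(completed)", data=df).fit()
        print(model.summary())  # the interaction term tests whether mode effects depend on assignment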

  9. Integrated Geophysical Methods Applied to Geotechnical and Geohazard Engineering: From Qualitative to Quantitative Analysis and Interpretation

    NASA Astrophysics Data System (ADS)

    Hayashi, K.

    2014-12-01

    The near-surface is the region of day-to-day human activity on the Earth, and it is exposed to natural phenomena that sometimes cause disasters. This presentation covers a broad spectrum of geotechnical and geohazard approaches to mitigating disasters and conserving the natural environment using geophysical methods, and emphasizes the contribution of geophysics to such issues. The presentation focuses on the usefulness of geophysical surveys in providing information to mitigate disasters, rather than on the theoretical details of a particular technique. Several techniques are introduced at the level of concept and application. Topics include various geohazard and geoenvironmental applications, such as earthquake disaster mitigation, prevention of floods triggered by torrential rain, environmental conservation, and studying the effects of global warming. Among the geophysical techniques, the active and passive surface-wave, refraction and resistivity methods are mainly highlighted. Together with the geophysical techniques, several related issues, such as performance-based design, standardization or regularization, internet access and databases, are also discussed. The presentation examines the application of geophysical methods to engineering investigations from the point of view of non-uniqueness and introduces the concepts of 'integrated' and 'quantitative' interpretation. Most geophysical analyses are essentially non-unique, and it is very difficult to obtain unique and reliable engineering solutions from only one geophysical method (Fig. 1). The only practical way to improve the reliability of an investigation is the joint use of several geophysical and geotechnical investigation methods, an integrated approach to geophysics. The result of a geophysical method is generally vague: 'here is a high-velocity layer, it may be bedrock', or 'this low-resistivity section may contain clayey soils'. Such vague, qualitative and subjective interpretation is of little use in general engineering design work. Engineers need more quantitative information. In order to apply geophysical methods to engineering design work, quantitative interpretation is very important. The presentation introduces several case studies from different countries around the world (Fig. 2) from the integrated and quantitative points of view.

  10. Extreme Ice Survey

    NASA Astrophysics Data System (ADS)

    Balog, J. D.

    2008-12-01

    The Extreme Ice Survey (EIS) is a multi-disciplinary visual documentation of glacial deflation and retreat in Greenland, Iceland, Alaska, the Rocky Mountains, Mexico, Bolivia and the Alps. Since spring 2007, 27 time-lapse cameras record 15 glaciers once an hour for every hour of daylight; documentation will continue until autumn 2009, with 400,000 frames produced. The images are compiled into video animations providing scientists with a vital look at glacial dynamics and the general public with real-world evidence of the changing Earth. EIS sets the stage for quantitative discussion of glacial retreat at the AGU annual meeting.

  11. Quick statistics Survey 55563 'Privacy Survey'

    E-print Network

    Kaiser, Gail E.

    Quick statistics for Survey 55563 'Privacy Survey'. Number of records in this query: 277. Total records in survey: 277. Percentage of total: 100.00%. Field summary

  12. 3D-quantitative structure-activity relationships of human immunodeficiency virus type-1 proteinase inhibitors: comparative molecular field analysis of 2-heterosubstituted statine derivatives-implications for the design of novel inhibitors.

    PubMed

    Kroemer, R T; Ettmayer, P; Hecht, P

    1995-12-01

    A set of 100 novel 2-heterosubstituted statine derivatives inhibiting human immunodeficiency virus type-1 proteinase has been investigated by comparative molecular field analysis. In order to combine the structural information available from X-ray analyses with a predictive quantitative structure-activity relationship (QSAR) model, docking experiments of a prototype compound into the receptor were performed, and the 'active conformation' was determined. The structure of the receptor was taken from the published X-ray analysis of the proteinase with bound MVT-101, the latter compound exhibiting high structural similarity with the inhibitors investigated. The validity of the resulting QSARs was confirmed in four different ways. (1) The common parameters, namely, the cross-validated r2 values obtained by the leave-one-out (LOO) method (r2cv = 0.572-0.593), and (2) the accurate prediction of a test set of 67 compounds (q2 = 0.552-0.569) indicated a high consistency of the models. (3) Repeated analyses with two randomly selected cross-validation groups were performed and the cross-validated r2 values monitored. The resulting average r2 values were of similar magnitudes compared to those obtained by the LOO method. (4) The coefficient fields were compared with the steric and electrostatic properties of the receptor and showed a high level of compatibility. Further analysis of the results led to the design of a novel class of highly active compounds containing an additional linkage between P1' and P3'. The predicted activities of these inhibitors were also in good agreement with the experimentally determined values. PMID:8523405
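
    The leave-one-out cross-validated r2 (q2) quoted above is the standard PRESS-based statistic; a generic sketch is shown below. It covers only the validation arithmetic; CoMFA's underlying PLS on molecular-field columns is not reproduced here, and the scikit-learn usage in the final comment is an assumption, not the original software.

        # Leave-one-out cross-validated r^2 (q^2): q^2 = 1 - PRESS / SS_total.
        # Generic over any model object exposing fit() and predict().
        import numpy as np

        def loo_q2(X, y, make_model):
            X = np.asarray(X, dtype=float)
            y = np.asarray(y, dtype=float)
            n = len(y)
            press = 0.0
            for i in range(n):
                train = np.arange(n) != i
                model = make_model()
                model.fit(X[train], y[train])
                pred = float(np.ravel(model.predict(X[i:i + 1]))[0])
                press += (y[i] - pred) ** 2
            ss_total = float(np.sum((y - y.mean()) ** 2))
            return 1.0 - press / ss_total

        # Usage sketch with scikit-learn's PLS (an assumption, not the original code):
        # from sklearn.cross_decomposition import PLSRegression
        # q2 = loo_q2(field_descriptors, activities, lambda: PLSRegression(n_components=5))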

  13. Hydrographic surveys

    USGS Publications Warehouse

    1896-01-01

    This circular is intended to answer questions asked by correspondents regarding the progress and character of the work of the "Irrigation Survey" and of related investigations being carried on by the Division of Hydrography of the United States Geological Survey. It also gives a review of the legislation authorizing this work, together with a list of publications of the Geological Survey showing the results accomplished.

  14. Image analysis combined with quantitative cytochemistry

    Microsoft Academic Search

    J. S. Ploem; A. M. J. Driel-Kulker; L. Goyarts-Veldstra; J. J. Ploem-Zaaijer; N. P. Verwoerd; M. Zwan

    1986-01-01

    This paper describes the application of image analysis combined with a quantitative staining method for the analysis of cervical specimens. The image analysis is carried out with the Leyden Television Analysis System, LEYTAS, of which two versions are described. LEYTAS-1 as well as LEYTAS-2 have both been designed with a high degree of flexibility and interaction facilities. A much wider

  15. AzTEC COSMOS Survey

    NASA Astrophysics Data System (ADS)

    Yun, Min Su; Ade, P. A.; Aretxaga, I.; Austermann, J.; Bock, J. J.; Hughes, D.; Kang, Y.; Kim, S.; Lowenthal, J.; Mauskopf, P.; Scott, K.; Wilson, G.

    2006-12-01

    The Cosmic Evolution Survey (COSMOS) is a 2 square degree HST/ACS survey specifically designed to probe galaxy evolution as a function of time and environment (PI: N. Scoville). In addition to the extensive HST data, the COSMOS team has acquired deep multi-wavelength data from radio to X-ray (VLA, Spitzer, NOAO, CFHT, Subaru, Galex, Chandra, XMM). Spectroscopic surveys are currently under way using Magellan, Keck, and the VLT, and an extensive photometric redshift database is also being assembled. Future surveys using major new instruments such as Herschel are also being planned. To take advantage of these rich complementary databases, we have undertaken a 1100 micron imaging survey of a 30' x 30' field centered just north of the earlier mm/submm surveys by Bolocam on the CSO and MAMBO on the 30-m telescope, with a small overlap. We will present some of the preliminary results from the survey.

  16. The Equity Survey.

    ERIC Educational Resources Information Center

    Council of Ontario Universities, Toronto.

    In August 2001, applicants to Ontario, Canada, universities received a survey from the Council of Ontario Universities to better understand the socioeconomic status and representation of members of recognized designated groups (Aboriginal Peoples of Canada, members of visible minority groups, people with disabilities, and women) in the university…

  17. NORWEB headquarters environmental survey

    Microsoft Academic Search

    Don Dickson

    1991-01-01

    This research study, conducted by Don Dickson of ERDC, was part of a larger programme into the causes of building sickness. The results of the survey showed that the environmental conditions were exactly as designed, with close control achieved by good operation and maintenance; the building compared favourably with the best naturally ventilated buildings.

  18. National Health Survey

    NSDL National Science Digital Library

    Australian Bureau of Statistics

    The survey was designed to obtain national benchmarks on a wide range of health issues, and to enable changes in health to be monitored over time. Information was collected about: the health status of the population; health-related aspects of lifestyle and other health risk factors; and the use of health services and other actions people had recently taken for their health."

  19. NATIONAL ALCOHOL SURVEY (NAS)

    EPA Science Inventory

    National Alcohol Survey (NAS) is designed to assess the trends in drinking practices and problems in the national population, including attitudes, norms, treatment and experiences and adverse consequences. It also studies the effects of public policy on drinking practices (i.e., ...

  20. A Survey Transition Course

    ERIC Educational Resources Information Center

    Johnston, William; McAllister, Alex M.

    2012-01-01

    Successful outcomes for a "Transition Course in Mathematics" have resulted from two unique design features. The first is to run the course as a "survey course" in mathematics, introducing sophomore-level students to a broad set of mathematical fields. In this single mathematics course, undergraduates benefit from an introduction of proof…

  1. Surveying the Therapeutic Landscape

    Microsoft Academic Search

    Jean Stephans Kavanagh; Thomas A. Musiak

    1994-01-01

    This paper is an initial report on the nationwide survey of outdoor facilities of horticultural therapy programs conducted from Texas Tech University. This report is intended to encourage discussions which explore areas of future research into the optimum physical design of outdoor plant-oriented therapeutic landscapes. The premise for these discussions is found in the perception of therapeutic landscapes therapy as

  2. The Hydrographic Survey Meta Database

    Microsoft Academic Search

    D. E. Neumann

    2008-01-01

    In 2005, the design and future considerations of this hydrographic survey metadata data base (HSMDB) were presented at the Marine Technology Society Proceedings. This paper will provide an update. The HSMDB now offers a user interface to previously unavailable hydrographic survey data, data products and metadata conveniently and freely over the Internet. The product descriptive report images alone have had

  3. Remaining Open to Quantitative, Qualitative, and Mixed-Method Designs: An Unscientific Compromise, or Good Research Practice? (Author note: This paper is based on the doctoral research of Keith R. McVilly, which was recognized with the Australian Psychological Society's 2005 Thesis Award for a thesis in the field of human relationships. The research was partly funded by an Australian Post Graduate Award, in the Faculty of Medicine, University of Sydney.)

    Microsoft Academic Search

    Keith R. Mcvilly; Roger J. Stancliffe; Trevor R. Parmenter

    2008-01-01

    The tension between quantitative and qualitative research paradigms is discussed, together with the important contribution of mixed-method designs, particularly as they are applied in the field of disability studies. Practical issues inherent in research designs involving participants with intellectual disability are explored, including sample building, participant consent, data collection and data analysis. It is concluded that scientific debate needs to move

  4. Eastern Lake Survey: Phase 2 and National Stream Survey. Phase 1. Processing laboratory operations report

    Microsoft Academic Search

    L. J. Arent; M. O. Morison; C. S. Soong

    1989-01-01

    The National Surface Water Survey was designed to characterize surface water chemistry in regions of the United States believed to be potentially sensitive to acidic deposition. The National Stream Survey was a synoptic survey designed to quantify the chemistry of streams in the areas of the United States known to contain low alkalinity waters. Phase II of the Eastern Lake

  5. Development of a survey to assess adolescent perceptions of teen parenting.

    PubMed

    Herrman, Judith W; Nandakumar, Ratna

    2012-01-01

    Initiatives designed to prevent teen pregnancy are often based on adult perceptions of the negative aspects of a teen birth. Qualitative research has revealed that teens may perceive positive rewards associated with teen parenting. These perceptions have not yet been examined through survey research. The theory of reasoned action proposes that individuals assess the costs and rewards prior to engaging in a behavior and provides a framework for the development of a survey instrument designed to measure adolescent thoughts about the costs and rewards of the teen parenting experience. This manuscript describes the development and testing of a quantitative survey instrument designed to measure adolescents' perceptions. Pretesting, piloting, exploratory factor analysis, and a variety of reliability and validity measures were used to determine the value of the measure. The thoughts on teen parenting survey (TTPS) demonstrates an alpha level of .90. The TTPS yields a cumulative score of teen perceptions about the impact of a teen birth during the adolescent years that may be used to assess youth beliefs, correlated with demographic data, used to identify teens at risk for pregnancy/parenting, or provide a pretest/posttest to assess the effectiveness of interventions designed to foster realistic attitudes toward teen parenting. PMID:22679706
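
    The reported alpha of .90 refers to Cronbach's alpha, which can be computed directly from a respondents-by-items matrix of Likert scores. The sketch below shows the standard formula; the data layout is an assumption rather than something taken from the TTPS itself.

        # Cronbach's alpha for an attitude scale such as the TTPS:
        # alpha = (k/(k-1)) * (1 - sum(item variances) / variance(total score)).
        # Rows are respondents, columns are items (assumed layout).
        import numpy as np

        def cronbach_alpha(item_scores) -> float:
            scores = np.asarray(item_scores, dtype=float)
            k = scores.shape[1]                         # number of items
            item_vars = scores.var(axis=0, ddof=1)      # per-item variances
            total_var = scores.sum(axis=1).var(ddof=1)  # variance of summed scale score
            return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

        # Usage sketch: alpha = cronbach_alpha(responses)  # responses: (n_respondents, n_items)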

  6. Quantitative Plant Phosphoproteomics

    PubMed Central

    Kline, Kelli G.; Barrett-Wilt, Gregory A.; Sussman, Michael R.

    2011-01-01

    Protein phosphorylation is a major post-translational modification in plants crucial for the regulation of diverse cellular functions. In the early stages of this field, efforts focused on the qualitative detection, identification, and cataloging of in vivo protein phosphorylation sites. Recently these studies have advanced into utilizing quantitative mass spectrometric measurements, capable of dynamically monitoring changes in phosphorylation levels in response to genetic and environmental alterations. This review will highlight current untargeted and targeted mass spectral technologies used for quantitative phosphoproteome measurements in plants, and provide a discussion of these phosphorylation changes in relation to important biological events. PMID:21764629

  7. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  8. Seismic Survey

    USGS Multimedia Gallery

    USGS hydrologists conduct a seismic survey in New Orleans, Louisiana. The survey was one of several geophysical methods used during USGS applied research on the utility of the multi-channel analysis of surface waves (MASW) seismic method (not pictured here) for non-invasive assessment of earthen leve...

  9. Electromagnetic Survey

    USGS Multimedia Gallery

    USGS hydrologist conducts a broadband electromagnetic survey in New Orleans, Louisiana. The survey was one of several geophysical methods used during USGS applied research on the utility of the multi-channel analysis of surface waves (MASW) seismic method for non-invasive assessment of earthen levee...

  10. Leisure and other data from the Canadian labour force survey add-on surveys: Bias concerns

    Microsoft Academic Search

    Jay Beaman; Lori B. Shelby

    2007-01-01

    This article provides intuitive and quantitative insights into biased estimates (e.g., means) as a problem when using Canadian Labour Force Survey (LFS) add-on survey data. This matters because LFS add-on surveys include key sources of leisure-related data and Canada's "Data Liberation Initiative" is encouraging their use. The magnitude of bias occurring is illustrated by finding a 16% reduction in

  11. A new quantitative assessment tool for computer science programs

    Microsoft Academic Search

    Timothy V. Fossum; Susan M. Haller

    2005-01-01

    We have designed a quantitative measure using card sorts that we show is statistically significant in distinguishing beginning students (novices) from those who have acquired competency appropriate to graduates of computer science (CS) programs. Using card sorts and applying this quantitative analysis, CS departments can arm themselves with another measure of the effectiveness of their academic programs in achieving their

  12. Quantitative Articles: Developing Studies for Publication in Counseling Journals

    ERIC Educational Resources Information Center

    Trusty, Jerry

    2011-01-01

    This article is presented as a guide for developing quantitative studies and preparing quantitative manuscripts for publication in counseling journals. It is intended as an aid for aspiring authors in conceptualizing studies and formulating valid research designs. Material is presented on choosing variables and measures and on selecting…

  13. Meaning in Method: The Rhetoric of Quantitative and Qualitative Research

    Microsoft Academic Search

    WILLIAM A. FIRESTONE

    1987-01-01

    The current debate about quantitative and qualitative methods focuses on whether there is a necessary connection between method-type and research paradigm that makes the different approaches incompatible. This paper argues that part of the connection is rhetorical. Quantitative methods express the assumptions of a positivist paradigm which holds that behavior can be explained through objective facts. Design and instrumentation persuade

  14. Blending Qualitative & Quantitative Research Methods in Theses and Dissertations.

    ERIC Educational Resources Information Center

    Thomas, R. Murray

    This guide discusses combining qualitative and quantitative research methods in theses and dissertations. It covers a wide array of methods, the strengths and limitations of each, and how they can be effectively interwoven into various research designs. The first chapter is "The Qualitative and the Quantitative." Part 1, "A Catalogue of…

  15. Eastern lake survey: Phase 2 and national stream survey-phase 1 processing laboratory operations report (project summary)

    Microsoft Academic Search

    L. J. Arent; M. O. Morison; C. S. Soong

    1990-01-01

    The National Surface Water Survey was designed to characterize surface water chemistry in regions of the United States believed to be potentially sensitive to acidic deposition. The National Stream Survey was a synoptic survey designed to quantify the chemistry of streams in areas of the United States known to contain low alkalinity waters. Phase II of the Eastern Lake Survey

  16. Sky Surveys

    NASA Astrophysics Data System (ADS)

    Djorgovski, S. George; Mahabal, Ashish; Drake, Andrew; Graham, Matthew; Donalek, Ciro

    Sky surveys represent a fundamental data basis for astronomy. We use them to map in a systematic way the universe and its constituents and to discover new types of objects or phenomena. We review the subject, with an emphasis on the wide-field, imaging surveys, placing them in a broader scientific and historical context. Surveys are now the largest data generators in astronomy, propelled by the advances in information and computation technology, and have transformed the ways in which astronomy is done. This trend is bound to continue, especially with the new generation of synoptic sky surveys that cover wide areas of the sky repeatedly and open a new time domain of discovery. We describe the variety and the general properties of surveys, illustrated by a number of examples, the ways in which they may be quantified and compared, and offer some figures of merit that can be used to compare their scientific discovery potential. Surveys enable a very wide range of science, and that is perhaps their key unifying characteristic. As new domains of the observable parameter space open up thanks to the advances in technology, surveys are often the initial step in their exploration. Some science can be done with the survey data alone (or a combination of data from different surveys), and some require a targeted follow-up of potentially interesting sources selected from surveys. Surveys can be used to generate large, statistical samples of objects that can be studied as populations or as tracers of larger structures to which they belong. They can also be used to discover or generate samples of rare or unusual objects and may lead to discoveries of some previously unknown types. We discuss a general framework of parameter spaces that can be used for an assessment and comparison of different surveys and the strategies for their scientific exploration. As we are moving into the Petascale regime and beyond, an effective processing and scientific exploitation of such large data sets and data streams pose many challenges, some of which are specific to any given survey and some of which may be addressed in the framework of Virtual Observatory and Astroinformatics. The exponential growth of data volumes and complexity makes a broader application of data mining and knowledge discovery technologies critical in order to take full advantage of this wealth of information. Finally, we discuss some outstanding challenges and prospects for the future.

  17. Quantitative tritium imaging

    Microsoft Academic Search

    Ian Stuart Youle

    1999-01-01

    Tritium imaging electrostatically focuses secondary electrons, produced at a surface by beta-particles from tritium in the material, to form an image of the tritiated areas. It has hitherto been essentially a qualitative technique. The research described here examines quantitative aspects of the process. Of particular importance is the effect of the depth of tritium on image intensity. For imaging purposes, tritium must

  18. Design, Implementation and Multisite Evaluation of a System Suitability Protocol for the Quantitative Assessment of Instrument Performance in Liquid Chromatography-Multiple Reaction Monitoring-MS (LC-MRM-MS)*

    PubMed Central

    Abbatiello, Susan E.; Mani, D. R.; Schilling, Birgit; MacLean, Brendan; Zimmerman, Lisa J.; Feng, Xingdong; Cusack, Michael P.; Sedransk, Nell; Hall, Steven C.; Addona, Terri; Allen, Simon; Dodder, Nathan G.; Ghosh, Mousumi; Held, Jason M.; Hedrick, Victoria; Inerowicz, H. Dorota; Jackson, Angela; Keshishian, Hasmik; Kim, Jong Won; Lyssand, John S.; Riley, C. Paige; Rudnick, Paul; Sadowski, Pawel; Shaddox, Kent; Smith, Derek; Tomazela, Daniela; Wahlander, Asa; Waldemarson, Sofia; Whitwell, Corbin A.; You, Jinsam; Zhang, Shucha; Kinsinger, Christopher R.; Mesri, Mehdi; Rodriguez, Henry; Borchers, Christoph H.; Buck, Charles; Fisher, Susan J.; Gibson, Bradford W.; Liebler, Daniel; MacCoss, Michael; Neubert, Thomas A.; Paulovich, Amanda; Regnier, Fred; Skates, Steven J.; Tempst, Paul; Wang, Mu; Carr, Steven A.

    2013-01-01

    Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories and ultimately in clinical laboratory settings require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms, configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low multiplex analyses as well as high multiplex approaches when software-driven scheduling of data acquisition is required. Performance was assessed by monitoring of a range of chromatographic and mass spectrometric metrics including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnoses of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and the RT drift <0.5min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use and need for a SSP to establish robust and reliable system performance. Use of a SSP helps to ensure that analyte quantification measurements can be replicated with good precision within and across multiple laboratories and should facilitate more widespread use of MRM-MS technology by the basic biomedical and clinical laboratory research communities. PMID:23689285
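
    The pass/fail thresholds quoted above can be checked mechanically against replicate injections. The sketch below applies them to arrays of per-replicate peak areas, peak widths and retention times for one monitored peptide; the input layout is an assumption, and retention-time drift is taken here simply as the max-min spread across replicates.

        # System-suitability checks per the thresholds quoted in the abstract:
        # peak area CV < 0.15, peak width CV < 0.15, RT SD < 0.15 min, RT drift < 0.5 min.
        # Inputs: one value per replicate injection for a single monitored peptide.
        import numpy as np

        def cv(values) -> float:
            values = np.asarray(values, dtype=float)
            return float(np.std(values, ddof=1) / np.mean(values))

        def system_suitability(peak_area, peak_width, retention_time):
            rt = np.asarray(retention_time, dtype=float)
            checks = {
                "peak_area_cv < 0.15": cv(peak_area) < 0.15,
                "peak_width_cv < 0.15": cv(peak_width) < 0.15,
                "rt_sd < 0.15 min": float(np.std(rt, ddof=1)) < 0.15,
                "rt_drift < 0.5 min": float(rt.max() - rt.min()) < 0.5,
            }
            return checks, all(checks.values())

        # Usage sketch: checks, passed = system_suitability(areas, widths, rts)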

  19. The Identification and Description of Critical Thinking Behaviors in the Practice of Clinical Laboratory Science, Part 1: Design, Implementation, Evaluation, and Results of a National Survey.

    ERIC Educational Resources Information Center

    Kenimer, Elizabeth A.

    2002-01-01

    A survey of 1,562 clinical laboratory scientists ranked critical thinking behaviors used in practice. Important behaviors were cognitive, behavioral, affective, and situated/contextual. Findings support a view of critical thinking as a metaprocess that spans learning domains. (Contains 17 references.) (SK)

  20. A survey of computer science teacher preparation programs in Israel tells us: computer science deserves a designated high school teacher preparation

    Microsoft Academic Search

    Noa Ragonis; Orit Hazzan; Judith Gal-Ezer

    2010-01-01

    This paper focuses on the development and implementation of computer science (CS) teacher preparation programs, which are among the educational and pedagogical challenges faced by those involved in the current development of CS. It presents a survey that reflects the accumulative knowledge gained in Israel over the past twenty years with respect to CS teacher preparation. We explored nine institutes

  1. Quantitative Activities for Introductory Astronomy

    NASA Astrophysics Data System (ADS)

    Keohane, Jonathan W.; Bartlett, J. L.; Foy, J. P.

    2010-01-01

    We present a collection of short lecture-tutorial (or homework) activities, designed to be both quantitative and accessible to the introductory astronomy student. Each of these involves interpreting some real data, solving a problem using ratios and proportionalities, and making a conclusion based on the calculation. Selected titles include: "The Mass of Neptune," "The Temperature on Titan," "Rocks in the Early Solar System," "Comets Hitting Planets," "Ages of Meteorites," "How Flat are Saturn's Rings?," "Tides of the Sun and Moon on the Earth," "The Gliese 581 Solar System," "Buckets in the Rain," "How Hot, Bright and Big is Betelgeuse?," "Bombs and the Sun," "What Forms Stars?," "Lifetimes of Cars and Stars," "The Mass of the Milky Way," "How Old is the Universe?," and "Is the Universe Speeding Up or Slowing Down?"
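
    As an illustration of the ratio-and-proportion style of these activities, a worked version of "The Mass of Neptune" can be done with Newton's form of Kepler's third law applied to Triton's orbit. The orbital radius and period below are approximate textbook values chosen for illustration; they are not taken from the activity itself.

        # Worked example in the spirit of "The Mass of Neptune":
        # M = 4*pi^2 * a^3 / (G * T^2), applied to Triton's orbit.
        import math

        G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
        a = 3.548e8          # Triton's orbital radius, m (~354,800 km, approximate)
        T = 5.877 * 86400    # Triton's orbital period, s (~5.877 days, approximate)

        neptune_mass = 4 * math.pi**2 * a**3 / (G * T**2)
        print(f"Neptune's mass ~ {neptune_mass:.2e} kg")   # roughly 1.0e26 kg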

  2. RECENT ADVANCES IN QUANTITATIVE NEUROPROTEOMICS

    PubMed Central

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2014-01-01

    The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification analysis, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges in terms of quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function as well as to various neuropsychiatric disorders, including schizophrenia and drug addiction, and neurodegenerative diseases, including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including labeled (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (label-free, MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). The biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; nevertheless, the quality and depth of the more recent quantitative proteomics studies are beginning to shed light on a number of aspects of neuroscience relating to normal brain function, as well as to the changes in protein expression and regulation that occur in neuropsychiatric and neurodegenerative disorders. PMID:23623823

  3. Quantitative Methods for Administrative Decision Making in Junior Colleges.

    ERIC Educational Resources Information Center

    Gold, Benjamin Knox

    With the rapid increase in number and size of junior colleges, administrators must take advantage of the decision-making tools already used in business and industry. This study investigated how these quantitative techniques could be applied to junior college problems. A survey of 195 California junior college administrators found that the problems…

  4. DEVELOPING NEW QUALITY INDICATORS IN SOCIAL SURVEYS

    Microsoft Academic Search

    Lucy Haselden; Amanda White

    Rather than rely on traditional measures of survey quality such as response rates, the Social Survey Division of the UK Office for National Statistics has been looking for alternative ways to report this. In order to achieve this, we have mapped out all the processes involved throughout the lifetime of a survey, from sampling and questionnaire design through to producing a

  5. State of the Field Survey, 2006

    ERIC Educational Resources Information Center

    Forum on Education Abroad, 2006

    2006-01-01

    In 2006, the Forum on Education Abroad conducted a State of the Field Survey of its membership. This survey is meant to be the first of an annual assessment of what is on the minds of Forum members and, by extension, the field of education abroad in general. The 2006 survey was developed and designed by the Forum Data Committee with input from the…

  6. survey paper

    E-print Network

    2012-03-09

    In Chapter 3, we will survey the remarkable recent isoperimetric ... marks and corrections during the school, and to all the participants for their interest in this course. ... After various preliminary results [Fel], [De], ..., the first main idea in the study.

  7. Surveying System

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Sunrise Geodetic Surveys are setting up their equipment for a town survey. Their equipment differs from conventional surveying systems that employ transit rod and chain to measure angles and distances. They are using ISTAC Inc.'s Model 2002 positioning system, which offers fast accurate surveying with exceptional signals from orbiting satellites. The special utility of the ISTAC Model 2002 is that it can provide positioning of the highest accuracy from Navstar PPS signals because it requires no knowledge of secret codes. It operates by comparing the frequency and time phase of a Navstar signal arriving at one ISTAC receiver with the reception of the same set of signals by another receiver. Data is computer processed and translated into three dimensional position data - latitude, longitude and elevation.

  8. Nature as the Most Important Coping Strategy Among Cancer Patients: A Swedish Survey.

    PubMed

    Ahmadi, Fereshteh; Ahmadi, Nader

    2015-08-01

    The authors have conducted a quantitative survey to examine the extent to which the results obtained in a qualitative study among cancer patients in Sweden (Ahmadi, Culture, religion and spirituality in coping: The example of cancer patients in Sweden, Uppsala, Acta Universitatis Upsaliensis, 2006) are applicable to a wider population of cancer patients in this country. In addition to questions relating to the former qualitative study, this survey also references the RCOPE questionnaire (designed by Kenneth I Pargament) in the design of the new quantitative study. In this study, questionnaires were distributed among persons diagnosed with cancer; 2,355 people responded. The results show that nature has been the most important coping method among cancer patients in Sweden. The highest mean value (2.9) is the factor 'nature has been an important resource to you so that you could deal with your illnesses'. Two out of three respondents (68 %) affirm that this method helped them feel significantly better during or after illness. The second highest average (2.8) is the factor 'listening to 'natural music' (birdsong and the wind)'. Two out of three respondents (66 %) answered that this coping method significantly helped them feel better during illness. The third highest average (2.7) is the factor 'to walk or engage in any activity outdoors gives you a spiritual sense'. This survey concerning the role of nature as the most important coping method for cancer patients confirms the result obtained from the previous qualitative studies. PMID:24363200

  9. Energy & Climate: Getting Quantitative

    NASA Astrophysics Data System (ADS)

    Wolfson, Richard

    2011-11-01

    A noted environmentalist claims that buying an SUV instead of a regular car is energetically equivalent to leaving your refrigerator door open for seven years. A fossil-fuel apologist argues that solar energy is a pie-in-the-sky dream promulgated by naïve environmentalists, because there's nowhere near enough solar energy to meet humankind's energy demand. A group advocating shutdown of the Vermont Yankee nuclear plant claims that 70% of its electrical energy is lost in transmission lines. Around the world, thousands agitate for climate action, under the numerical banner "350." Neither the environmentalist, the fossil-fuel apologist, the antinuclear activists, nor most of those marching under the "350" banner can back up their assertions with quantitative arguments. Yet questions about energy and its environmental impacts almost always require quantitative answers. Physics can help! This poster gives some cogent examples, based on the newly published 2nd edition of the author's textbook Energy, Environment, and Climate.

  10. Macropinosome quantitation assay

    PubMed Central

    Wang, Jack T.H.; Teasdale, Rohan D.; Liebl, David

    2014-01-01

    In contrast to phagocytosis, macropinocytosis is not directly initiated by interactions between cell surface receptors and cargo ligands, but is a result of constitutive membrane ruffling driven by dynamic remodelling of cortical actin cytoskeleton in response to stimulation of growth factor receptors. Wang et al. (2010) [13] developed a reliable assay that allows quantitative assessment of the efficiency and kinetics of macropinosome biogenesis and/or maturation in cells where the function of a targeted protein has been perturbed by pharmacological inhibitors or by knock-down or knock-out approaches. In this manuscript we describe a modified quantitative protocol to measure the rate and volume of fluid phase uptake in adherent cells. This assay: (i) uses fluorescent dextran, microscopy and semi-automated image analysis; (ii) allows quantitation of macropinosomes within large numbers of individual cells; and (iii) can be applied also to non-homogenous cell populations, including transiently transfected cell monolayers. We present the background necessary to consider when customising this protocol for application to new cell types or experimental variations.

  11. Quantitation of mitral regurgitation.

    PubMed

    Topilsky, Yan; Grigioni, Francesco; Enriquez-Sarano, Maurice

    2011-01-01

    Mitral regurgitation (MR) is the most frequent valve disease. Nevertheless, evaluation of MR severity is difficult because standard color flow imaging is plagued by considerable pitfalls. Modern surgical indications in asymptomatic patients require precise assessment of MR severity. MR severity assessment is always comprehensive, utilizing all views and methods. Determining trivial/mild MR is usually easy, based on small jet and flow convergence. Specific signs of severe MR (pulmonary venous flow systolic reversal or severe mitral lesion) are useful but insensitive. Quantitative methods, quantitative Doppler (measuring stroke volumes) and flow convergence (aka PISA method), measure the lesion severity as effective regurgitant orifice (ERO) and volume overload as regurgitant volume (RVol). Interpretation of these numbers should be performed in context of specific MR type. In organic MR (intrinsic valve lesions) ERO ≥ 0.40 cm² and RVol ≥ 60 mL are associated with poor outcome, while in functional MR ERO ≥ 0.20 cm² and RVol ≥ 30 mL mark reduced survival. While MR assessment should always be comprehensive, quantitative assessment of MR provides measures that are strongly predictive of outcome and should be the preferred approach. The ERO and RVol measured by these methods require interpretation in causal context to best predict outcome and determine MR management. PMID:22041039
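
    For reference, the flow-convergence (PISA) quantitation mentioned above rests on a few standard relations, reproduced here in their usual textbook form (they are not spelled out in the abstract): the regurgitant flow through the hemispheric convergence shell of radius r at the aliasing velocity, the effective regurgitant orifice, and the regurgitant volume.

        \[
          Q_{\mathrm{reg}} = 2\pi r^{2}\, V_{\mathrm{aliasing}}, \qquad
          \mathrm{ERO} = \frac{Q_{\mathrm{reg}}}{V_{\mathrm{MR,\,peak}}}, \qquad
          \mathrm{RVol} = \mathrm{ERO} \times \mathrm{VTI}_{\mathrm{MR}}
        \]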

  12. ICF-Based Disability Survey in a Rural Population of Adults and Older Adults Living in Cinco Villas, Northeastern Spain: Design, Methods and Population Characteristics

    Microsoft Academic Search

    Jesús de Pedro-Cuesta; Magdalena Comín Comín; Javier Virués-Ortega; Javier Almazán Isla; Fuencisla Avellanal; Enrique Alcalde Cabero; Olga Burzaco; Juan Manuel Castellote; Alarcos Cieza; Javier Damián; Maria João Forjaz; Belén Frades; Esther Franco; Luis Alberto Larrosa; Rosa Magallón; Gloria Martín García; Cristina Martínez; Pablo Martínez Martín; Roberto Pastor-Barriuso; Ana Peña Jiménez; Adolfo Población Martínez; Geoffrey Reed; Cristina Ruíz

    2010-01-01

    Background: This article describes the methods of a door-to-door screening survey exploring the distribution of disability and its major determinants in northeastern Spain. This study will set the basis for the development of disability-related services for the rural elderly in northeastern Spain. Methods: The probabilistic sample was composed of 1,354 de facto residents from a population of 12,784 Social Security

  13. Robotic Surveying

    SciTech Connect

    Suzy Cantor-McKinney; Michael Kruzic

    2007-03-01

    ZAPATA ENGINEERING challenged our engineers and scientists, which included robotics expertise from Carnegie Mellon University, to design a solution to meet our client's requirements for rapid digital geophysical and radiological data collection of a munitions test range with no down-range personnel. A prime concern of the project was to minimize exposure of personnel to unexploded ordnance and radiation. The field season was limited by extreme heat, cold and snow. Geographical Information System (GIS) tools were used throughout this project to accurately define the limits of mapped areas, build a common mapping platform from various client products, track production progress, allocate resources and relate subsurface geophysical information to geographical features for use in rapidly reacquiring targets for investigation. We were hopeful that our platform could meet the proposed 35 acres per day, towing both a geophysical package and a radiological monitoring trailer. We held our breath and crossed our fingers as the autonomous Speedrower began to crawl across the playa lakebed. We met our proposed production rate, and we averaged just less than 50 acres per 12-hour day using the autonomous platform with a path tracking error of less than +/- 4 inches. Our project team mapped over 1,800 acres in an 8-week (4 days per week) timeframe. The expertise of our partner, Carnegie Mellon University, was recently demonstrated when their two autonomous vehicle entries finished second and third at the 2005 Defense Advanced Research Projects Agency (DARPA) Grand Challenge. 'The Grand Challenge program was established to help foster the development of autonomous vehicle technology that will some day help save the lives of Americans who are protecting our country on the battlefield', said DARPA Grand Challenge Program Manager, Ron Kurjanowicz. Our autonomous remote-controlled vehicle (ARCV) was a modified New Holland 2550 Speedrower retrofitted to allow the machine-actuated functions to be controlled by an onboard computer. The computer-controlled Speedrower was developed at Carnegie Mellon University to automate agricultural harvesting. Harvesting tasks require the vehicle to cover a field using minimally overlapping rows at slow speeds in a similar manner to geophysical data acquisition. The Speedrower had demonstrated its ability to perform as it had already logged hundreds of acres of autonomous harvesting. This project is the first use of autonomous robotic technology on a large-scale for geophysical surveying.

  14. A TECHNICAL SURVEY OF HARMONIC ANALYSIS TERENCE TAO

    E-print Network

    Tao, Terence

    A TECHNICAL SURVEY OF HARMONIC ANALYSIS. TERENCE TAO. ...-variable harmonic analysis, and some of the techniques being developed. Note: references are completely ... Broadly speaking, harmonic analysis is centered around the analysis (in particular, quantitative

  15. A Survey of Quantitative Descriptions of Molecular Structure

    PubMed Central

    Guha, Rajarshi; Willighagen, Egon

    2013-01-01

    Numerical characterization of molecular structure is a first step in many computational analysis of chemical structure data. These numerical representations, termed descriptors, come in many forms, ranging from simple atom counts and invariants of the molecular graph to distribution of properties, such as charge, across a molecular surface. In this article we first present a broad categorization of descriptors and then describe applications and toolkits that can be employed to evaluate them. We highlight a number of issues surrounding molecular descriptor calculations such as versioning and reproducibility and describe how some toolkits have attempted to address these problems. PMID:23110530

  16. The Role of Introductory Geosciences in Students' Quantitative Literacy

    NASA Astrophysics Data System (ADS)

    Wenner, J. M.; Manduca, C.; Baer, E. M.

    2006-12-01

    Quantitative literacy is more than mathematics; it is about reasoning with data. Colleges and universities have begun to recognize the distinction between mathematics and quantitative literacy, modifying curricula to reflect the need for numerate citizens. Although students may view geology as 'rocks for jocks', the geosciences are truthfully rife with data, making introductory geoscience topics excellent context for developing the quantitative literacy of students with diverse backgrounds. In addition, many news items that deal with quantitative skills, such as the global warming phenomenon, have their basis in the Earth sciences and can serve as timely examples of the importance of quantitative literacy for all students in introductory geology classrooms. Participants at a workshop held in 2006, 'Infusing Quantitative Literacy into Introductory Geoscience Courses,' discussed and explored the challenges and opportunities associated with the inclusion of quantitative material and brainstormed about effective practices for imparting quantitative literacy to students with diverse backgrounds. The tangible results of this workshop add to the growing collection of quantitative materials available through the DLESE- and NSF-supported Teaching Quantitative Skills in the Geosciences website, housed at SERC. There, faculty can find a collection of pages devoted to the successful incorporation of quantitative literacy in introductory geoscience. The resources on the website are designed to help faculty to increase their comfort with presenting quantitative ideas to students with diverse mathematical abilities. A methods section on "Teaching Quantitative Literacy" (http://serc.carleton.edu/quantskills/methods/quantlit/index.html) focuses on connecting quantitative concepts with geoscience context and provides tips, trouble-shooting advice and examples of quantitative activities. The goal in this section is to provide faculty with material that can be readily incorporated into existing introductory geoscience courses. In addition, participants at the workshop (http://serc.carleton.edu/quantskills/workshop06/index.html) submitted and modified more than 20 activities and model courses (with syllabi) designed to use best practices for helping introductory geoscience students to become quantitatively literate. We present insights from the workshop and other sources for a framework that can aid in increasing quantitative literacy of students from a variety of backgrounds in the introductory geoscience classroom.

  17. The quantitative methods component in social sciences curricula in view of journal content

    Microsoft Academic Search

    Wim P. M. Vijverberg

    1997-01-01

    What level of quantitative methods (or applied statistical analysis) should graduate students in the social sciences be prepared to master, if they are to be competitive in the job market? In the age of information technology, more data, in survey or other form, about any imaginable topic exist than ever before. Empirical analysis on the basis of more advanced quantitative

  18. Acquisition of technological capability in development: A quantitative case study of Pakistan's capital goods sector

    Microsoft Academic Search

    Henny Romijn

    1997-01-01

    There is by now a substantial body of literature that points to the importance of technological capability acquisition for industrial development. This literature, however, is almost exclusively based on qualitative case studies. This paper addresses the lack of objective quantitative measurement and testing. Using data from a survey among capital goods manufacturers in Pakistan, it develops quantitative proxies for technological

  19. In this paper, we survey various designs of low-power full-adder cells, from conventional CMOS to really inventive XOR-based designs

    E-print Network

    Al-Asaad, Hussain

    ... from conventional CMOS to really inventive XOR-based designs. We further describe simulation experiments that compare and consequently determine the delay and power consumption for the various full-adder cells. ... Keywords: full-adder cell design, low-power circuits, power and delay estimation, VLSI implementations

  20. The Large Synoptic Survey Telescope

    E-print Network

    Ku?el, Petr

    The Large Synoptic Survey Telescope (LSST) will be a large-aperture, wide-field, ground-based telescope designed ... interest for cosmology and fundamental physics, LSST will provide tight constraints on the nature of dark ... considerations that have led to the design of the LSST, and discuss a sampling of the exciting science ...

  1. CHINA HEALTH AND NUTRITION SURVEY

    EPA Science Inventory

    The China Health and Nutrition Survey is designed to examine the effects of health, nutrition, and family planning policies and programs as they have been implemented by national and local governments. It is designed to examine how both the social and economic transformation of C...

  2. National Nursing Home Survey

    MedlinePLUS

    CDC's National Nursing Home Survey (NNHS) page, with links to the survey methodology, data files, long-term care medication data, the National Nursing Assistant Survey, and related publications and products.

  3. Dialect Survey

    NSDL National Science Digital Library

    Sometimes individuals may find themselves wondering: "What do you call the long sandwich that contains cold cuts, lettuce, and so on?" Certainly there is a strong regional variation to this type of sandwich, as some people along the Eastern seaboard may refer to it as a "grinder," people in Louisiana may be more likely to refer to it as a "poor boy," and so on. This rather interesting dialect survey, conducted by Professor Bert Vaux (with his colleagues) at Harvard University, examines the spatial distribution of certain dialect phrases for various objects or phenomena, and also looks at the phonology behind certain words, such as caramel or lawyer. Over 30,000 participants took part in the survey, and visitors to the site can view these dialect maps and learn more about the breakdown of the participants in the survey as well.

  4. The Observations of Redshift Evolution in Large-Scale Environments (ORELSE) Survey. I. The Survey Design and First Results on CL 0023+0423 at z = 0.84 and RX J1821.6+6827 at z = 0.82

    NASA Astrophysics Data System (ADS)

    Lubin, L. M.; Gal, R. R.; Lemaux, B. C.; Kocevski, D. D.; Squires, G. K.

    2009-06-01

    We present the Observations of Redshift Evolution in Large-Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 h_70^-1 Mpc around 20 well-known clusters at redshifts of 0.6 < z < 1.3. The goal of the survey is to examine a statistical sample of dynamically active clusters and large-scale structures in order to quantify galaxy properties over the full range of local and global environments. We describe the survey design, the cluster sample, and our extensive observational data covering at least 25' around each target cluster. We use adaptively smoothed red galaxy density maps from our wide-field optical imaging to identify candidate groups/clusters and intermediate-density large-scale filaments/walls in each cluster field. Because photometric techniques (such as photometric redshifts, statistical overdensities, and richness estimates) can be highly uncertain, the crucial component of this survey is the unprecedented amount of spectroscopic coverage. We are using the wide-field, multiobject spectroscopic capabilities of the Deep Imaging Multi-Object Spectrograph to obtain 100-200+ confirmed cluster members in each field. Our survey has already discovered the Cl 1604 supercluster at z ≈ 0.9, a structure which contains at least eight groups and clusters and spans 13 Mpc × 100 Mpc. Here, we present the results on the large-scale environments of two additional clusters, Cl 0023+0423 at z = 0.84 and RX J1821.6+6827 at z = 0.82, which highlight the diversity of global properties at these redshifts. The optically selected Cl 0023+0423 is a four-way group-group merger with constituent groups having measured velocity dispersions between 206 and 479 km s^-1. The galaxy population is dominated by blue, star-forming galaxies, with 80% of the confirmed members showing [O II] emission. The strength of the Hδ line in a composite spectrum of 138 members indicates a substantial contribution from recent starbursts to the overall galaxy population. In contrast, the X-ray-selected RX J1821.6+6827 is a largely isolated, massive cluster with a measured velocity dispersion of 926 ± 77 km s^-1. The cluster exhibits a well-defined red sequence with a large quiescent galaxy population. The results from these two targets, along with preliminary findings on other ORELSE clusters, suggest that optical selection may be more effective than X-ray surveys at detecting less-evolved, dynamically active systems at these redshifts.

  5. Quantitative biomedical mass spectrometry

    NASA Astrophysics Data System (ADS)

    de Leenheer, André P.; Thienpont, Linda M.

    1992-09-01

    The scope of this contribution is an illustration of the capabilities of isotope dilution mass spectrometry (IDMS) for quantification of target substances in the biomedical field. After a brief discussion of the general principles of quantitative MS in biological samples, special attention will be paid to new technological developments or trends in IDMS from selected examples from the literature. The final section will deal with the use of IDMS for accuracy assessment in clinical chemistry. Methodological aspects considered crucial for avoiding sources of error will be discussed.

  6. A Quantitative Occam's Razor

    E-print Network

    Rafael D. Sorkin

    2005-11-29

    This paper derives an objective Bayesian "prior" based on considerations of entropy/information. By this means, it produces a quantitative measure of goodness of fit (the "H-statistic") that balances higher likelihood against the number of fitting parameters employed. The method is intended for phenomenological applications where the underlying theory is uncertain or unknown. For example, it can help decide whether the large angle anomalies in the CMB data should be taken seriously. I am therefore posting it now, even though it was published before the arxiv existed.

  7. Physical Surveys of Over 300 Buildings in Hot and Humid Climates Indicate Material/Design Performance Flaws Exist in Comparison to Expected Results Using Nationally Accepted Standards 

    E-print Network

    Othmer, A. E.

    2000-01-01

    ... including windows, skylighting, insulation, and major HVAC system components do not perform as well as expected in the installed/finished product state. The end result is buildings designed with calculations taken from standard ASTM and ASHRAE formulas do...

  8. Physical Surveys of Over 300 Buildings in Hot and Humid Climates Indicate Material/Design Performance Flaws Exist in Comparison to Expected Results Using Nationally Accepted Standards

    E-print Network

    Othmer, A. E.

    2000-01-01

    ... including windows, skylighting, insulation, and major HVAC system components do not perform as well as expected in the installed/finished product state. The end result is buildings designed with calculations taken from standard ASTM and ASHRAE formulas do...

  9. WESF natural phenomena hazards survey

    SciTech Connect

    Wagenblast, G.R., Westinghouse Hanford

    1996-07-01

    A team of engineers conducted a systematic natural phenomena hazards (NPH) survey for the 225-B Waste Encapsulation and Storage Facility (WESF). The survey assesses the existing design documentation that serves as the structural design basis for WESF and the Interim Safety Basis (ISB). The lateral-force-resisting systems for the 225-B building structures and the anchorages for the WESF safety-related systems were evaluated. The original seismic and other design analyses were technically reviewed. Engineering judgment assessments were made of the probability of NPH survival, including seismic, for the 225-B structures and WESF safety systems. The method for the survey is based on the experience of the investigating engineers, and documented earthquake experience (expected response) data. The survey uses knowledge of NPH performance and engineering experience to determine the WESF strengths for NPH resistance and to uncover possible weak links. The survey, in general, concludes that the 225-B structures and WESF safety systems are designed and constructed commensurate with the current Hanford Site design criteria.

  10. DIETARY SURVEYS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Dietary surveys are used for multiple purposes. They range from measurement of food disappearance at the national level, to food use at the household level, to detailed multiple assessments of individual intake for linkage with health outcomes. Each of these methods has strengths and limitations, d...

  11. Space Survey

    NSDL National Science Digital Library

    This is a lesson about society and space exploration. Learners will survey the public about their different opinions about space exploration and the use of robotics in space exploration. Then they will represent and analyze the results. This is lesson 5 of 16 in the MarsBots learning module.

  12. Sleep medicine services in Saudi Arabia: The 2013 national survey

    PubMed Central

    Bahammam, Ahmed S.; Alsaeed, Mashni; AlAhmari, Mohammed; AlBalawi, Ibrahim; Sharif, Munir M.

    2014-01-01

    BACKGROUND: We conducted this national survey to quantitatively assess sleep medicine services in the Kingdom of Saudi Arabia (KSA) and to identify obstacles that specialists and hospitals face, precluding the establishment of this service. MATERIALS AND METHODS: A self-administered questionnaire was designed to collect the following: General information regarding each hospital, information regarding sleep medicine facilities (SFs), the number of beds, the number of sleep studies performed and obstacles to the establishment of SFs. The questionnaire and a cover letter explaining the study objectives were mailed and distributed by respiratory care practitioners to 32 governmental hospitals and 18 private hospitals and medical centers in the KSA. RESULTS: The survey identified 18 SFs in the KSA. The estimated per capita number of beds/year/100,000 people was 0.11 and the per capita polysomnography (PSG) rate was 18.0 PSG/year/100,000 people. The most important obstacles to the progress of sleep medicine in the KSA were a lack of trained sleep technologists and a lack of sleep medicine specialists. CONCLUSION: The sleep medicine services provided in the KSA have improved since the 2005 survey; however, these services are still below the level of service provided in developed countries. Organized efforts are needed to overcome the identified obstacles and challenges to the progress of sleep medicine in the KSA. PMID:24551019

  13. Quantitative hypermethylation of a small panel of genes augments the diagnostic accuracy in fine-needle aspirate washings of breast lesions

    Microsoft Academic Search

    Carmen Jeronimo; Paula Monteiro; Rui Henrique; Mário Dinis-Ribeiro; Isabel Costa; Vera L. Costa; Luísa Filipe; André L. Carvalho; Mohammad O. Hoque; Irene Pais; Conceição Leal; Manuel R. Teixeira; David Sidransky

    2008-01-01

    Purpose: We hypothesized that comprehensive breast cancer methylation profiling might provide biomarkers for diagnostic assessment of suspicious breast lesions using fine needle aspiration biopsy (FNA). Experimental design: Twenty-three gene promoters were surveyed by quantitative methylation-specific PCR in bisulfite-modified DNA from 66 breast carcinomas (BCa), 31 fibroadenomas (FB) and 12 normal breast (NT) samples to define a set of genes differentially methylated in

  14. Does Survey Medium Affect Responses? An Exploration of Electronic and Paper Surveying in British Columbia Schools

    ERIC Educational Resources Information Center

    Walt, Nancy; Atwood, Kristin; Mann, Alex

    2008-01-01

    The purpose of this study was to determine whether or not survey medium (electronic versus paper format) has a significant effect on the results achieved. To compare survey media, responses from elementary students to British Columbia's Satisfaction Survey were analyzed. Although this study was not experimental in design, the data set served as a…

  15. FWF/SPECTRA EXECUTIVE SUMMARY OF THE FWF SURVEY 2002

    E-print Network

    Blatt, Rainer

    Of the 12,887 scientists contacted, 3,147 took part in the survey, a response rate of roughly 24%. DESIGN OF THE SURVEY: To improve its service, the FWF commissioned a wide-ranging survey among persons engaged in scientific activities.

  16. Applied Sampling and Surveying Instructor: Stephanie Eckman, Ph.D.

    E-print Network

    Gerkmann, Ralf

    Applied Sampling and Surveying. Instructor: Stephanie Eckman, Ph.D. ... and Analysis. Leslie Kish, Survey Sampling. This class will provide students with practical methods for survey ... practical topics that students will encounter when designing and carrying out surveys. Topics will include ...

  17. MHEC Academic Scheduling Software Survey Results.

    ERIC Educational Resources Information Center

    Midwestern Higher Education Commission Academic Software Committee Research Bulletin, 1995

    1995-01-01

    This bulletin summarizes the chief quantitative findings of a survey of 264 small and medium sized colleges and universities in the midwest concerning their use of and interest in academic scheduling software. This type of software assists in planning course offerings, assigning instructors and course functions to facilities and time slots, and…

  18. Intelligent Tutoring Systems: A Tutorial Survey.

    ERIC Educational Resources Information Center

    Clancey, William J.

    This survey of intelligent tutoring systems describes the components of these systems, different teaching scenarios, and the relation of these systems to a theory of instruction. It argues that the underlying pedagogical approach is to make latent knowledge manifest by using different forms of quantitative modeling: (1) simulating physical…

  19. The Imaging and Slitless Spectroscopy Instrument for Surveys (ISSIS) for the World Space Observatory--Ultraviolet (WSO-UV): optical design, performance and verification tests.

    NASA Astrophysics Data System (ADS)

    Gómez de Castro, A. I.; Perea, B.; Sánchez, N.; Chirivella, J.; Seijas, J.

    2015-05-01

    ISSIS is the instrument for imaging and slitless spectroscopy on-board WSO-UV. The baseline for ISSIS design, as approved at the PDR held in May 2012, consists of two acquisition channels, both of them provided with photon counting detectors with Micro-Channel Plates (MCP). These two channels are named the Far Ultraviolet (FUV) Channel covering the 1150-1750 Å wavelength range and the Near Ultraviolet (NUV) Channel in the 1850-3200 Å range. In this work, we present the current ISSIS design and its main characteristics. We present the main performance verification for ISSIS to ensure that the current design of ISSIS fulfils the scientific requirements and to ensure the feasibility of the in-flight calibration. We also define the facilities and technical characteristics for realizing the tests.

  20. Geophex airborne unmanned survey system

    SciTech Connect

    Won, I.J.; Taylor, D.W.A.

    1995-03-01

    The purpose of this effort is to design, construct, and evaluate a portable, remotely-piloted, airborne, geophysical survey system. This nonintrusive system will provide "stand-off" capability to conduct surveys and detect buried objects, structures, and conditions of interest at hazardous locations. This system permits two operators to rapidly conduct geophysical characterization of hazardous environmental sites. During a survey, the operators remain remote from, but within visual distance of, the site. The sensor system never contacts the Earth, but can be positioned near the ground so that weak anomalies can be detected.

  1. Quantitative Mineralogical Characterization of Oregon Erionite

    NASA Astrophysics Data System (ADS)

    Dogan, A.; Dogan, M.; Ballirano, P.

    2006-12-01

    Erionite has been classified as a Group-I human carcinogen by the IARC Working Group. The fibrogenic potential of erionite varies from low to high yield of mesothelioma, which may require quantitative characterization of the physicochemical properties of erionite before any experimental design. The toxicity of the mineral is such that quantitative characterization of erionite is extremely important. Yet the erionite specimens used have often been incompletely or incorrectly characterized, throwing doubt on the results of the work. For example, none of the Turkish erionites published until recently had a balance error (E%) of less than 10%, and the Mg cation content of the type specimen of erionite-Ca from Maze, Niigata Prefecture, Japan is more than 0.8. In the present study, an erionite sample from near Rome, Oregon has been quantitatively characterized using powder X-ray diffraction, Rietveld refinement, scanning electron microscopy, energy-dispersive spectroscopy, inductively coupled plasma mass spectrometry, and Mössbauer spectroscopy. The cell parameters of the erionite-K from Oregon are computed as a=13.2217(2) Å and c=15.0671 Å; the chemical composition of the erionite, as major oxides, rare earth elements and other trace elements, is characterized quantitatively. The crystal chemistry of the erionite is computed based upon the guidelines of the IMA zeolite report of 1997.

  2. Implementing total quality management : Statistical analysis of survey results

    Microsoft Academic Search

    Zinovy D. Radovilsky; J. William Gotcher; Sverre Slattsveen

    1996-01-01

    Describes the results of a survey in manufacturing, distribution and service organizations concerning total quality management (TQM). These results proved that an effective TQM programme should contain consistent training of all employees; significant improvement in communication between departments; and development of the standards to measure and control the cost of quality. Based on the survey results, identifies quantitative relationships between

  3. Biomedical imaging ontologies: A survey and proposal for future work

    PubMed Central

    Smith, Barry; Arabandi, Sivaram; Brochhausen, Mathias; Calhoun, Michael; Ciccarese, Paolo; Doyle, Scott; Gibaud, Bernard; Goldberg, Ilya; Kahn, Charles E.; Overton, James; Tomaszewski, John; Gurcan, Metin

    2015-01-01

    Background: Ontology is one strategy for promoting interoperability of heterogeneous data through consistent tagging. An ontology is a controlled structured vocabulary consisting of general terms (such as “cell” or “image” or “tissue” or “microscope”) that form the basis for such tagging. These terms are designed to represent the types of entities in the domain of reality that the ontology has been devised to capture; the terms are provided with logical definitions thereby also supporting reasoning over the tagged data. Aim: This paper provides a survey of the biomedical imaging ontologies that have been developed thus far. It outlines the challenges, particularly faced by ontologies in the fields of histopathological imaging and image analysis, and suggests a strategy for addressing these challenges in the example domain of quantitative histopathology imaging. Results and Conclusions: The ultimate goal is to support the multiscale understanding of disease that comes from using interoperable ontologies to integrate imaging data with clinical and genomics data.

  4. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  5. QTest: Quantitative Testing of Theories of Binary Choice

    PubMed Central

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495
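
    The QTest software itself implements far more general order-constrained tests, but the underlying idea, distinguishing a fixed preference expressed with occasional response errors from mere wavering, can be sketched with a one-sided binomial check. This is an illustration only, not the QTest package; the 0.25 error bound, the function name, and the counts are assumptions.

      # Illustrative sketch (not the QTest package): check whether repeated binary
      # choices are consistent with a fixed preference for option A expressed with
      # at most a bounded error rate. The 0.25 bound and counts are assumptions.
      from scipy.stats import binom

      def consistent_with_fixed_preference(n_chose_a, n_trials, max_error=0.25, alpha=0.05):
          """One-sided test of H0: P(choose A) >= 1 - max_error."""
          p0 = 1.0 - max_error
          # Probability of seeing this few A-choices if the preference were real.
          p_value = binom.cdf(n_chose_a, n_trials, p0)
          return p_value >= alpha, p_value

      ok, p = consistent_with_fixed_preference(n_chose_a=14, n_trials=20)
      print(ok, round(p, 4))   # True here: 14/20 is still compatible with the bound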

  6. EXPERIMENTAL QUANTITATIVE TRANSPORT PROBE AND CONTROL BOX SAMPLING SYSTEM

    EPA Science Inventory

    Three quantitative sampling transport probe and control box sampling systems were designed and fabricated. The systems are designed to permit the transport of samples of aerosols from a source to a sensor without significant modification of mass rate and size distribution of the ...

  7. Qualitative and quantitative cost analysis for sheet metal stamping

    Microsoft Academic Search

    Dunbing Tang; Walter Eversheim; Günther Schuh

    2004-01-01

    This research paper presents a qualitative and quantitative cost analysis system for sheet metal stamping development at an early design stage. First, the authors identify problems in the traditional metal stamping part and die development processes and outline the need for performance of concurrent stamping part design in order to achieve cost effectiveness that accommodates the concept of concurrent engineering.

  8. A comparison of risk assessment techniques from qualitative to quantitative

    SciTech Connect

    Altenbach, T.J.

    1995-02-13

    Risk assessment techniques vary from purely qualitative approaches, through a regime of semi-qualitative to the more traditional quantitative. Constraints such as time, money, manpower, skills, management perceptions, risk result communication to the public, and political pressures all affect the manner in which risk assessments are carried out. This paper surveys some risk matrix techniques, examining the uses and applicability for each. Limitations and problems for each technique are presented and compared to the others. Risk matrix approaches vary from purely qualitative axis descriptions of accident frequency vs consequences, to fully quantitative axis definitions using multi-attribute utility theory to equate different types of risk from the same operation.
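
    As a minimal sketch of the purely qualitative end of the spectrum surveyed above, the snippet below maps accident-frequency and consequence categories to a coarse risk ranking; the category labels, the additive scoring rule, and the thresholds are illustrative assumptions, not taken from the report.

      # Sketch of a qualitative risk matrix: frequency and consequence categories
      # mapped to a coarse risk ranking. Labels, scoring, and thresholds are
      # illustrative assumptions.
      FREQUENCY = ["improbable", "remote", "occasional", "frequent"]
      CONSEQUENCE = ["negligible", "marginal", "critical", "catastrophic"]

      def risk_rank(frequency: str, consequence: str) -> str:
          score = FREQUENCY.index(frequency) + CONSEQUENCE.index(consequence)
          if score <= 2:
              return "low"
          if score <= 4:
              return "medium"
          return "high"

      print(risk_rank("occasional", "critical"))    # -> "medium"
      print(risk_rank("frequent", "catastrophic"))  # -> "high"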

  9. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965
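
    The core culturomics computation, the relative frequency of a term per publication year, can be sketched in a few lines; the tiny in-line corpus below is a placeholder for illustration only, not the digitized-books corpus used in the study.

      # Sketch of a culturomics-style computation: relative frequency of a word
      # per publication year over a corpus of (year, text) records. The in-line
      # corpus is a placeholder for illustration.
      from collections import Counter, defaultdict

      corpus = [
          (1900, "the telegraph carried the news"),
          (1950, "the television carried the news"),
          (2000, "the internet carried the news and more news"),
      ]

      def relative_frequency(word, records):
          totals = defaultdict(int)   # total tokens per year
          counts = defaultdict(int)   # occurrences of `word` per year
          for year, text in records:
              tokens = text.lower().split()
              totals[year] += len(tokens)
              counts[year] += Counter(tokens)[word.lower()]
          return {year: counts[year] / totals[year] for year in sorted(totals)}

      print(relative_frequency("news", corpus))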

  10. Quality of data in multiethnic health surveys.

    PubMed Central

    Pasick, R. J.; Stewart, S. L.; Bird, J. A.; D'Onofrio, C. N.

    2001-01-01

    OBJECTIVE: There has been insufficient research on the influence of ethno-cultural and language differences in public health surveys. Using data from three independent studies, the authors examine methods to assess data quality and to identify causes of problematic survey questions. METHODS: Qualitative and quantitative methods were used in this exploratory study, including secondary analyses of data from three baseline surveys (conducted in English, Spanish, Cantonese, Mandarin, and Vietnamese). Collection of additional data included interviews with investigators and interviewers; observations of item development; focus groups; think-aloud interviews; a test-retest assessment survey; and a pilot test of alternatively worded questions. RESULTS: The authors identify underlying causes for the 12 most problematic variables in three multiethnic surveys and describe them in terms of ethnic differences in reliability, validity, and cognitive processes (interpretation, memory retrieval, judgment formation, and response editing), and differences with regard to cultural appropriateness and translation problems. CONCLUSIONS: Multiple complex elements affect measurement in a multiethnic survey, many of which are neither readily observed nor understood through standard tests of data quality. Multiethnic survey questions are best evaluated using a variety of quantitative and qualitative methods that reveal different types and causes of problems. PMID:11889288
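
    One of the quantitative checks mentioned above is a test-retest assessment of item reliability. A minimal sketch of such a check is Cohen's kappa computed over two administrations of the same yes/no item; the responses below are invented for illustration.

      # Sketch of a test-retest reliability check: Cohen's kappa between two
      # administrations of the same yes/no survey item. Responses are invented.
      from sklearn.metrics import cohen_kappa_score

      time1 = ["yes", "no", "yes", "yes", "no", "no", "yes", "no"]
      time2 = ["yes", "no", "no",  "yes", "no", "yes", "yes", "no"]

      print(round(cohen_kappa_score(time1, time2), 2))   # 0.5 for this toy example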

  11. The challenge of using survey research to design and evaluate a water quality project: A case study in Jackson County, Florida

    SciTech Connect

    Purvis, A.; Boggess, W.G.; Graham, W.D.; Holt, J.; Hewitt, T.D.

    1992-12-01

    The character of agriculture and of nonpoint pollution varies across regions. Under the national Water Quality Initiative, ninety grassroots pilot programs were established, providing opportunities to devise targeted local education and technical assistance schemes. Data collection to support the design and evaluation of one project is described, illustrating challenges and opportunities in water policy programming.

  12. EMAP - WEST COMMUNICATIONS, SURVEY DESIGNS FOR SAMPLING SURFACE WATER CONDITION IN THE WEST (ONE-PAGE BRIEFING PAPER ON EMAP-WEST ACTIVITIES)

    EPA Science Inventory

    To meet the requirements of the Clean Water Act, States and Tribes must provide a statement of condition of their surface water resources. The design used in EMAP-West will allow these requirements to be met in an efficient as well as scientifically defensible manner. The desi...

  13. Survey research: it's just a few questions, right?

    PubMed

    Tait, Alan R; Voepel-Lewis, Terri

    2015-07-01

    While most anesthesiologists and other physician- or nurse-scientists are familiar with traditional descriptive, observational, and interventional study design, survey research has typically remained the preserve of the social scientists. To that end, this article provides a basic overview of the elements of good survey design and offers some rules of thumb to help guide investigators through the survey process. PMID:25929546

  14. Survey Expectations

    E-print Network

    Pesaran, M Hashem; Weale, Martin

    2006-03-14

    In the Handbook of Economic Forecasting, G. Elliott, C.W.J. Granger, and A. Timmermann (eds.), North-Holland (forthcoming 2006). Helpful comments by two anonymous referees, Kajal Lahiri and Ron Smith are gratefully acknowledged. ... The error-correction and the general extrapolation model are algebraically equivalent, but the former is particularly convenient when survey data are available on expectations over different horizons.

  15. Protocol for a national, mixed-methods knowledge, attitudes and practices survey on non-communicable diseases

    PubMed Central

    2011-01-01

    Background Mongolia is undergoing rapid epidemiological transition with increasing urbanisation and economic development. The lifestyle and health of Mongolians are changing as a result, shown by the 2005 and 2009 STEPS surveys (World Health Organization's STEPwise Approach to Chronic Disease Risk Factor Surveillance) that described a growing burden of Non-Communicable Diseases and injuries (NCDs). This study aimed to assess, describe and explore the knowledge, attitudes and practices of the Mongolian adult population around NCDs in order to better understand the drivers and therefore develop more appropriate solutions to this growing disease burden. In addition, it aimed to provide data for the evaluation of current public health programs and to assist in building effective, evidence-based health policy. Methods/design This national survey consisted of both quantitative and qualitative methods. A quantitative household-based questionnaire was conducted using a nationally representative sample of 3854 rural and urban households. Participants were selected using a multi-stage cluster sampling technique in 42 regions across Mongolia, including rural and urban sites. Permanent residents of sampled households were eligible for recruitment, if aged between 15-64 years. This quantitative arm was then complemented and triangulated with a qualitative component: twelve focus group discussions focusing on diet, exercise and alcohol consumption. Discussions took place in six sites across the country, facilitated by local, trained health workers. These six sites were chosen to reflect major Mongolian cultural and social groups. Discussion KAP surveys are well represented in the literature, but studies that aim to explore the knowledge, attitudes and practices of a population around NCDs remain scarce. This is despite the growing number of national epidemiological surveys, such as STEPS, which aim to quantify the burden of these diseases but do not explore the level of population-based awareness, understanding, risk-perception and possible motivation for change. Therefore this paper will contribute to building a knowledge base of NCD KAP survey methodology for future use in epidemiology and research worldwide. PMID:22208645
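
    A minimal sketch of the two-stage cluster sampling idea described in the methods (select clusters first, then households within each selected cluster) is shown below; the region names, cluster counts, and sample sizes are illustrative assumptions, not the study's actual sampling frame.

      # Sketch of two-stage cluster sampling: sample regions (clusters) first,
      # then households within each sampled region. The frame below is invented.
      import random

      random.seed(1)
      frame = {f"region_{r:02d}": [f"hh_{r:02d}_{h:03d}" for h in range(200)]
               for r in range(1, 43)}   # 42 regions, 200 households each

      def two_stage_sample(frame, n_clusters=10, households_per_cluster=25):
          clusters = random.sample(list(frame), n_clusters)   # stage 1: regions
          households = []
          for c in clusters:
              households.extend(random.sample(frame[c], households_per_cluster))  # stage 2
          return clusters, households

      clusters, households = two_stage_sample(frame)
      print(len(clusters), len(households))   # 10 clusters, 250 households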

  16. EASTERN LAKE SURVEY-PHASE II AND NATIONAL STREAM SURVEY-PHASE I PROCESSING LABORATORY OPERATIONS REPORT

    EPA Science Inventory

    The National Surface Water Survey was designed to characterize surface water chemistry in regions of the United States believed to be potentially sensitive to acidic deposition. The National Stream Survey was a synoptic survey designed to quantify the chemistry of streams in area...

  17. Monitoring and design of stormwater control basins

    USGS Publications Warehouse

    Veenhuis, J.E.; Parrish, J.H.; Jennings, M.E.

    1989-01-01

    The City of Austin, Texas, has played a pioneering role in the control of urban nonpoint source pollution by enacting watershed and stormwater ordinances, overseeing detailed monitoring programs, and improving design criteria for stormwater control methods. The effectiveness of the methods used in Austin, and perhaps in other areas of the United States, to protect urban water resources has not yet been fully established. Therefore, detailed monitoring programs capable of quantitatively determining the effectiveness of control methods and of stormwater ordinances, are required. The purpose of this report is to present an overview of the City of Austin's stormwater monitoring program, including previous monitoring programs with the U.S. Environmental Protection Agency and the U.S. Geological Survey, and to describe the relation of monitoring to design of stormwater control basins.

  18. Quantitative spectroscopic imaging for noninvasive early cancer detection

    PubMed Central

    Yu, Chung-Chieh; Lau, Condon; O’Donoghue, Geoff; Mirkovic, Jelena; McGee, Sasha; Galindo, Luis; Elackattu, Alphi; Stier, Elizabeth; Grillone, Gregory; Badizadegan, Kamran; Dasari, Ramachandra R.; Feld, Michael S.

    2008-01-01

    We report a fully quantitative spectroscopy imaging instrument for wide area detection of early cancer (dysplasia). This instrument provides quantitative maps of tissue biochemistry and morphology, making it a potentially powerful surveillance tool for objective early cancer detection. We describe the design, construction, calibration, and first clinical application of this new system. We demonstrate its accuracy using physical tissue models. We validate its diagnostic ability on a resected colon adenoma, and demonstrate feasibility of in vivo imaging in the oral cavity. PMID:18825262

  19. Visual Design Principles: An Empirical Study of Design Lore

    ERIC Educational Resources Information Center

    Kimball, Miles A.

    2013-01-01

    Many books, designers, and design educators talk about visual design principles such as balance, contrast, and alignment, but with little consistency. This study uses empirical methods to explore the lore surrounding design principles. The study took the form of two stages: a quantitative literature review to determine what design principles are…

  20. A Multiwavelength Exploration of the Grand Design Spiral M83: The HST/WFC3 Continuum and Emission-line Imaging Survey

    NASA Astrophysics Data System (ADS)

    Blair, William P.; Long, K. S.; Winkler, P. F.; Kuntz, K. D.; Whitmore, B. C.; Soria, R.; Dopita, M. A.; Ghavamian, P.; Chandar, R.; Rangelov, B.

    2013-01-01

    As part of HST Cycle 19 program 12513, we have obtained WFC3 UVIS and IR camera data of five new fields in the face-on spiral M83 which, combined with two existing fields obtained in 2009 and 2010, nearly cover the entire bright disk and spiral arms. Broadband U, B, V, I, and H will permit us to characterize the ages of hundreds of star clusters as well as the general stellar disk populations as a function of spatial position. The broadband data can also be scaled and subtracted from our narrowband Hα, [S II], [O III], and [Fe II] 1.64 micron images to allow us to find and measure hundreds of supernova remnants, planetary nebulae, and other emission nebulae. Comparison of these data with each other and with deep (729 ks) Chandra data and new radio surveys with JVLA and ATCA will allow us to search for and characterize optical counterparts to many of the X-ray sources, including especially supernova remnants and X-ray binaries, and estimate the ages of the underlying host stellar populations. We will provide an overview of this rich data set and show initial results from these comparisons. This work is supported by STScI grant HST-GO-12513.01-A to the Johns Hopkins University.

  1. Quantitative sensory testing.

    PubMed

    Siao, Peter; Cros, Didier P

    2003-05-01

    Quantitative sensory testing is a reliable way of assessing large and small sensory nerve fiber function. Sensory deficits may be quantified and the data used in parametric statistical analysis in research studies and drug trials. It is an important addition to the neurophysiologic armamentarium, because conventional sensory nerve conduction tests only the large fibers. QST is a psychophysical test and lacks the objectivity of NCS. The results are subject to changes owing to distraction, boredom, mental fatigue, drowsiness, or confusion. When patients are consciously or unconsciously biased toward an abnormal QST result, no psychophysical testing can reliably distinguish these patients from those with organic disease. QST tests the integrity of the entire sensory neuraxis and is of no localizing value. Dysfunction of the peripheral nerves or central nervous system may give rise to abnormalities in QST. As is true for other neurophysiologic tests, QST results should always be interpreted in light of the patient's clinical presentation. Quantitative sensory testing has been shown to be reasonably reproducible over a period of days or weeks in normal subjects. Because longitudinal QST studies of patients in drug trials are usually performed over a period of several months to a few years, reproducibility studies on the placebo-control group should be included. For individual patients, more studies are needed to determine the maximum allowable difference between two QSTs that can be attributed to experimental error. The reproducibility of thermal thresholds may not be as good as that of vibration threshold. Different commercially available QST instruments have different specifications (thermode size, stimulus characteristics), testing protocols, algorithms, and normal values. Only QST instruments and their corresponding methodologies that have been shown to be reproducible should be used for research and patient care. The data in the literature do not allow conclusions regarding the superiority of any QST instruments. The future of QST is promising; however, many factors can affect QST results. As is true for other neurophysiologic tests, QST is susceptible to many extraneous factors and to misuse when not properly interpreted by the clinician. PMID:12795516

  2. Open-source social Network Assessment Survey System (NASS)

    E-print Network

    Du, Aaron (Aaron Yinan)

    2005-01-01

    The selection of targeted survey questions and the design of survey questionnaires are instrumental in social networks research. With the accelerating growth of theory and experimental knowledge in the area of social ...

  3. Quantitative DNA Fiber Mapping

    SciTech Connect

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.
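
    The conversion from measured fiber distance to kilobase pairs implied by the ~2.3 kb/µm stretching factor is simple enough to sketch directly; the example distances below are hypothetical.

      # Sketch of the distance-to-kilobase conversion implied by the ~2.3 kb/um
      # stretching factor; the measured distances are hypothetical examples.
      STRETCH_KB_PER_UM = 2.3

      def microns_to_kb(distance_um):
          """Convert a measured fiber distance in micrometers to kilobase pairs."""
          return distance_um * STRETCH_KB_PER_UM

      for d in (4.3, 17.8, 42.0):   # hypothetical probe-to-probe distances in um
          print(f"{d:5.1f} um -> {microns_to_kb(d):6.1f} kb")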

  4. Infrastructure Survey 2011

    ERIC Educational Resources Information Center

    Group of Eight (NJ1), 2012

    2012-01-01

    In 2011, the Group of Eight (Go8) conducted a survey on the state of its buildings and infrastructure. The survey is the third Go8 Infrastructure survey, with previous surveys being conducted in 2007 and 2009. The current survey updated some of the information collected in the previous surveys. It also collated data related to aspects of the…

  5. Quantitative Proteomics Isotope Coding Proteomics

    E-print Network

    Richardson, David

    Quantitative Proteomics. Isotope coding proteomics: in-vitro labeling (ICAT; acid-cleavable ICAT quantitation); increased isotope spacing (9 Daltons rather than 8 Daltons) for less interference from oxidation; Stable Isotope Labeling by Amino Acids in Cell Culture (SILAC) as a simple and accurate approach.

  6. Quantitative epidemiology: Progress and challenges

    Microsoft Academic Search

    Ian R. Dohoo

    2008-01-01

    This manuscript is derived from a presentation at the 2006 AVEPM – Schwabe Symposium which honoured the 2006 recipient of the Calvin Schwabe Award – Dr. S. Wayne Martin. Throughout his career, Dr. Martin was instrumental in furthering the development of quantitative epidemiology. This manuscript highlights some of the recent advances in quantitative methods used in veterinary epidemiology and identifies

  7. Workshop on quantitative dynamic stratigraphy

    SciTech Connect

    Cross, T.A.

    1988-04-01

    This document discusses the development of quantitative simulation models for the investigation of geologic systems. The selection of variables, model verification, evaluation, and future directions in quantitative dynamic stratigraphy (QDS) models are detailed. Interdisciplinary applications, integration, implementation, and transfer of QDS are also discussed. (FI)

  8. Quantitatively Analyzing Stealthy Communication Channels

    E-print Network

    Ryder, Barbara G.

    Quantitatively Analyzing Stealthy Communication Channels. Patrick Butler, Kui Xu, and Danfeng ... Understanding the capacity of such communication channels is important in detecting organized cyber ... and quantitatively analyze new techniques that can be used to hide malicious DNS activities both at the host ...

  9. Quantitative analysis of software architectures

    Microsoft Academic Search

    Simonetta Balsamo; Marco Bernardo; Vincenzo Grassi

    Quantitative analysis of software systems is a critical issue in the development of applications for heterogeneous distributed and mobile systems. It has been recognised that performance analysis should be integrated in the software development life cycle since the early stages. We focus on quantitative analysis of software architectures (SA) and in particular on performance models and languages to represent, evaluate

  10. Helping Students Become Quantitatively Literate

    ERIC Educational Resources Information Center

    Piatek-Jimenez, Katrina; Marcinek, Tibor; Phelps, Christine M.; Dias, Ana

    2012-01-01

    In recent years, the term "quantitative literacy" has become a buzzword in the mathematics community. But what does it mean, and is it something that should be incorporated into the high school mathematics classroom? In this article, the authors will define quantitative literacy (QL), discuss how teaching for QL differs from teaching a traditional…

  11. Quantitative bedrock geology of the conterminous United States of America

    Microsoft Academic Search

    Bernhard Peucker-Ehrenbrink; Mark W. Miller

    2002-01-01

    We quantitatively analyze the area-age distribution of bedrock based on data from the most recent geologic map of the conterminous United States of America [King and Beikman, 1974a, 1974b], made available in digital form by the United States Geological Survey. The area-age distribution agrees surprisingly well with older data [Higgs, 1949] but provides much higher temporal resolution. The mean stratigraphic

  12. Quantitative analysis of spirality in elliptical galaxies

    NASA Astrophysics Data System (ADS)

    Dojcsak, Levente; Shamir, Lior

    2014-04-01

    We use an automated galaxy morphology analysis method to quantitatively measure the spirality of galaxies classified manually as elliptical. The data set used for the analysis consists of 60,518 galaxy images with redshift obtained by the Sloan Digital Sky Survey (SDSS) and classified manually by Galaxy Zoo, as well as the RC3 and NA10 catalogues. We measure the spirality of the galaxies by using the Ganalyzer method, which transforms the galaxy image to its radial intensity plot to detect galaxy spirality that is in many cases difficult to notice by manual observation of the raw galaxy image. Experimental results using manually classified elliptical and S0 galaxies with redshift <0.3 suggest that galaxies classified manually as elliptical and S0 exhibit a nonzero signal for the spirality. These results suggest that the human eye observing the raw galaxy image might not always be the most effective way of detecting spirality and curves in the arms of galaxies.
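
    The central step described above, collapsing a galaxy image into a radial intensity profile so that spirality shows up as angle-dependent structure, can be sketched as follows; this is an illustration of the idea only, not the Ganalyzer implementation, and the toy image is invented.

      # Sketch of collapsing an image into a radial intensity profile: sample the
      # intensity around a circle of fixed radius as a function of polar angle.
      # Illustration of the idea only, not the Ganalyzer implementation.
      import numpy as np

      def radial_intensity_profile(image, center, radius, n_angles=360):
          """Return (angles, intensities) sampled on a circle of given radius."""
          cy, cx = center
          angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
          ys = np.clip((cy + radius * np.sin(angles)).astype(int), 0, image.shape[0] - 1)
          xs = np.clip((cx + radius * np.cos(angles)).astype(int), 0, image.shape[1] - 1)
          return angles, image[ys, xs]

      # Toy image: a single bright pixel on the sampling circle shows up as a peak
      # at one angle, the kind of angle-dependent structure a spiral arm produces.
      img = np.zeros((101, 101))
      img[50, 80] = 1.0
      angles, profile = radial_intensity_profile(img, center=(50, 50), radius=30)
      print(int(profile.argmax()), float(profile.max()))   # 0 1.0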

  13. 42 CFR 431.610 - Relations with standard-setting and survey agencies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Relations with standard-setting and survey agencies. 431.610 Section 431... Relations with standard-setting and survey agencies. (a) Basis and purpose...upon request. (e) Designation of survey agency. The plan must provide...

  14. 77 FR 15722 - Southern California Hook and Line Survey; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-16

    ...Southern California Hook and Line Survey review meeting will be...Southern California Hook and Line Survey review meeting will be...Southern California Hook and Line survey design and protocols; (2) examine the analytical methods used to generate rockfish...

  15. INTERNATIONAL STUDENT SURVEY INTERNATIONAL STUDENT SURVEY

    E-print Network

    INTERNATIONAL STUDENT SURVEY: Summary and Results, October 2010. Aarhus Universitet, International Centre. Sections include an introduction and the survey objectives.

  16. Laser Surveying

    NASA Technical Reports Server (NTRS)

    1978-01-01

    NASA technology has produced a laser-aided system for surveying land boundaries in difficult terrain. It does the job more accurately than conventional methods, takes only one-third the time normally required, and is considerably less expensive. In surveying to mark property boundaries, the objective is to establish an accurate heading between two "corner" points. This is conventionally accomplished by erecting a "range pole" at one point and sighting it from the other point through an instrument called a theodolite. But how do you take a heading between two points which are not visible to each other, for instance, when tall trees, hills or other obstacles obstruct the line of sight? That was the problem confronting the U.S. Department of Agriculture's Forest Service. The Forest Service manages 187 million acres of land in 44 states and Puerto Rico. Unfortunately, National Forest System lands are not contiguous but intermingled in complex patterns with privately-owned land. In recent years much of the private land has been undergoing development for purposes ranging from timber harvesting to vacation resorts. There is a need for precise boundary definition so that both private owners and the Forest Service can manage their properties with confidence that they are not trespassing on the other's land.

  17. Quantitative Brain PET

    Microsoft Academic Search

    Vijay Dhawan; Ken Kazumata; William Robeson; Abdelfatihe Belakhlef; Claude Margouleff; Thomas Chaly; Toshitaka Nakamura; Robert Dahl; Donald Margouleff; David Eidelberg

    1998-01-01

    Purpose: Recent developments in the design of positron emission tomography (PET) scanners have made three-dimensional (3D) data acquisition attractive because of significantly higher sensitivity compared to the conventional 2D mode (with lead\\/tungsten septa extended). However, the increased count rate in 3D mode comes at the cost of increased scatter, randoms, and dead time. Several schemes to correct for these effects

  18. Quantitative Luminescence Imaging System

    SciTech Connect

    Batishko, C.R.; Stahl, K.A.; Fecht, B.A.

    1992-12-31

    The goal of the MEASUREMENT OF CHEMILUMINESCENCE project is to develop and deliver a suite of imaging radiometric instruments for measuring spatial distributions of chemiluminescence. Envisioned deliverables include instruments working at the microscopic, macroscopic, and life-sized scales. Both laboratory and field portable instruments are envisioned. The project also includes development of phantoms as enclosures for the diazoluminomelanin (DALM) chemiluminescent chemistry. A suite of either phantoms in a variety of typical poses, or phantoms that could be adjusted to a variety of poses, is envisioned. These are to include small mammals (rats), mid-sized mammals (monkeys), and human body parts. A complete human phantom that can be posed is a long-term goal of the development. Taken together, the chemistry and instrumentation provide a means for imaging rf dosimetry based on chemiluminescence induced by the heat resulting from rf energy absorption. The first delivered instrument, the Quantitative Luminescence Imaging System (QLIS), resulted in a patent, and an R&D Magazine 1991 R&D 100 award, recognizing it as one of the 100 most significant technological developments of 1991. The current status of the project is that three systems have been delivered, several related studies have been conducted, two preliminary human hand phantoms have been delivered, system upgrades have been implemented, and calibrations have been maintained. Current development includes sensitivity improvements to the microscope-based system; extension of the large-scale (potentially life-sized targets) system to field portable applications; extension of the 2-D large-scale system to 3-D measurement; imminent delivery of a more refined human hand phantom and a rat phantom; rf, thermal and imaging subsystem integration; and continued calibration and upgrade support.

  19. The UKIRT Infrared Deep Sky Survey (UKIDSS)

    E-print Network

    A. Lawrence; S. J. Warren; O. Almaini; A. C. Edge; N. C. Hambly; R. F. Jameson; P. Lucas; M. Casali; A. Adamson; S. Dye; J. P. Emerson; S. Foucaud; P. Hewett; P. Hirst; S. T. Hodgkin; M. J. Irwin; N. Lodieu; R. G. McMahon; C. Simpson; I. Smail; D. Mortlock; M. Folger

    2007-08-15

    We describe the goals, design, implementation, and initial progress of the UKIRT Infrared Deep Sky Survey (UKIDSS), a seven year sky survey which began in May 2005, using the UKIRT Wide Field Camera. It is a portfolio of five survey components covering various combinations of the filter set ZYJHK and H_2. The Large Area Survey, the Galactic Clusters Survey, and the Galactic Plane Survey cover approximately 7000 square degrees to a depth of K~18; the Deep Extragalactic Survey covers 35 square degrees to K~21, and the Ultra Deep Survey covers 0.77 square degrees to K~23. Summed together UKIDSS is 12 times larger in effective volume than the 2MASS survey. The prime aim of UKIDSS is to provide a long term astronomical legacy database; the design is however driven by a series of specific goals -- for example to find the nearest and faintest sub-stellar objects; to discover Population II brown dwarfs, if they exist; to determine the substellar mass function; to break the z=7 quasar barrier; to determine the epoch of re-ionisation; to measure the growth of structure from z=3 to the present day; to determine the epoch of spheroid formation; and to map the Milky Way through the dust, to several kpc. The survey data are being uniformly processed, and released in stages through the WFCAM Science Archive (WSA : http://surveys.roe.ac.uk/wsa). Before the formal survey began, UKIRT and the UKIDSS consortium collaborated in obtaining and analysing a series of small science verification (SV) projects to complete the commissioning of the camera. We show some results from these SV projects in order to demonstrate the likely power of the eventual complete survey. Finally, using the data from the First Data Release we assess how well UKIDSS is meeting its design targets so far.

  20. Policy Implications from: -Charging Surveys

    E-print Network

    California at Davis, University of

    Presentation (IEEE, November 7, 2013; Michael Nicholas) on policy implications drawn from charging surveys and charging models. Topics include statewide quick-charging choices, workplace charging, free charging, congestion, spatial and temporal PEV energy use, GIS analysis, and charging network design; partners include CEC, Nissan, and ECOtality.

  1. A second generation survey AUV

    Microsoft Academic Search

    J. G. Bellingham; C. A. Goudey; T. R. Consi; J. W. Bales; D. K. Atwood; J. J. Leonard; C. Chryssostomidis

    1994-01-01

    Odyssey class autonomous underwater vehicles (AUVs) are designed to be small, high performance survey platforms. The logistical complexities of operating off of oceanographic vessels or in hostile environments, such as the Arctic, make a small vehicle with minimal support requirements extremely attractive. Although built for great depths and endurances of up to two days, Odyssey class vehicles are small by

  2. Pulsewidth modulation-a survey

    Microsoft Academic Search

    Joachim Holtz

    1992-01-01

    Pulse-width modulation (PWM) is surveyed with reference to performance criteria, feedforward schemes, and feedback PWM control. It is stressed that the implementation of PWM techniques in the design of AC motor drive systems depends on the machine type, the power level, and the semiconductor devices used in the power converter. It is ultimately performance and cost criteria which determine the

  3. Civil Technology. Surveying. Post Secondary Curriculum Guide.

    ERIC Educational Resources Information Center

    Fitzpatrick, Beverley J.; And Others

    This curriculum guide was designed for use in postsecondary civil technology--surveying education programs in Georgia. Its purpose is to provide for development of entry level skills in surveying in the areas of knowledge, theoretical structure, tool usage, diagnostic ability, related supportive skills, and occupational survival skills. The first…

  4. Colorado Youth Risk Behavior Survey, 1993.

    ERIC Educational Resources Information Center

    Utah Univ., Salt Lake City. Health Education Dept.

    This report describes the results of a survey conducted in 1993. The report was written to stimulate useful discussions among educators, parents, and youth about ways to increase informed support for effective, school-based comprehensive health programs. The survey was designed by national experts to measure the extent to which adolescents engage…

  5. NATIONAL SURVEY FOR AMBULATORY SURGERY (NSAS)

    EPA Science Inventory

    The National Survey of Ambulatory Surgery (NSAS), which was initiated by the National Center for Health Statistics in 1994, is a national survey designed to meet the need for information about the use of ambulatory surgery services in the United States. For NSAS, ambulatory surge...

  6. Career Development Needs of Women. Survey.

    ERIC Educational Resources Information Center

    Economic and Social Opportunities, Inc., San Jose, CA.

    A survey was conducted to define the career development needs of women in five school districts which form the Metropolitan Adult Education Program (MAEP) area (San Jose, California). (The survey was a first step in a project to demonstrate the transferability of existing career development programs from other school areas to designated need…

  7. Statistical Disclosure Control for Survey Data

    Microsoft Academic Search

    Chris Skinner

    2009-01-01

    Statistical disclosure control refers to the methodology used in the design of the statistical outputs from a survey for protecting the confidentiality of respondents' answers. The threat to confidentiality is assumed to come from a hypothetical intruder who has access to these outputs and seeks to use them to disclose information about a survey respondent. One key concern relates to

  8. NATIONAL LONGITUDINAL ALCOHOL EPIDEMIOLOGIC SURVEY (NLAES)

    EPA Science Inventory

    The NLAES is a household survey of 42,862 persons 18 years and older in the coterminous United States. The survey was designed to provide comprehensive information on amounts and patterns of alcohol consumption and on problems associated with alcohol. It is the only nationally-re...

  9. Quantitative tritium imaging

    NASA Astrophysics Data System (ADS)

    Youle, Ian Stuart

    1999-12-01

    Tritium imaging electrostatically focuses secondary electrons, produced at a surface by beta particles from tritium in the material, to form an image of the tritiated areas. It has hitherto been essentially a qualitative technique. The research described here examines quantitative aspects of the process. Of particular importance is the effect of the depth of tritium on image intensity. For imaging purposes, tritium must obviously be nearer the surface than the maximum range of a beta particle (about 2.6 μm in graphite, and less in heavier materials). Numerical simulation, however, indicates that sensitivity falls off very rapidly with depth, dropping by 50% within the first 100 nm. Simulations also indicate that for all but the shallowest tritium, imaging sensitivity drops exponentially with depth. This is experimentally investigated by implanting specimens with tritium ions of a known energy to give a calculated depth distribution. The surfaces of the specimens are sputtered to change the depth distribution, and the sample is imaged at various stages in the process. The evolution of the image intensity of the sputtered region is compared with the predictions of the numerical model. This technique was partially successful in graphite, but not in aluminum, owing to the mobility of tritium in aluminum and the modification of the aluminum surface by oxidation. Imaging was used to explore the nature of these modifications. In other experiments, tritiated graphite specimens were coated with aluminum, and the image intensity was measured as a function of coating thickness. As the tritium was fixed in atomic traps in the graphite, and the aluminum coatings could be assumed to be unoxidized at the time of deposition, the problems previously encountered in aluminum were avoided. In these experiments, image intensity decreased exponentially with coating thickness, as predicted by the mathematical models, although slightly less rapidly than expected. It is also demonstrated that for many typical tritium distributions, the limit of lateral resolution will be that intrinsic to the electrostatic lens system, and will be fairly independent of (and smaller than) the range of the tritium beta. This implies that imaging using secondary electrons produced by other radioisotopes may be possible with reasonable resolution.
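
    The exponential depth dependence described in the abstract lends itself to a simple sensitivity model. The sketch below assumes an exponential fall-off with a 50% drop over the first 100 nm; both the functional form and the half-depth are taken from the abstract, but the helper function itself is illustrative, not the author's fitted model.

        def relative_sensitivity(depth_nm, half_depth_nm=100.0):
            # Relative imaging sensitivity for tritium buried depth_nm below the surface,
            # assuming the exponential fall-off described above with a 50% loss over the
            # first 100 nm (an illustrative reading of the abstract, not a fitted model).
            return 0.5 ** (depth_nm / half_depth_nm)

        # Example: sensitivity drops to ~6% of the surface value at a depth of 400 nm.
        for d in (0, 100, 200, 400):
            print(f"depth {d:4d} nm -> relative sensitivity {relative_sensitivity(d):.3f}")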

  10. What Do We Measure? Methodological versus Institutional Validity in Student Surveys

    ERIC Educational Resources Information Center

    Johnson, Jeffrey Alan

    2011-01-01

    This paper examines the tension in the process of designing student surveys between the methodological requirements of good survey design and the institutional needs for survey data. Building on the commonly used argumentative approach to construct validity, I build an interpretive argument for student opinion surveys that allows assessment of the…

  11. Sexual Experiences Survey: Reliability and Validity.

    ERIC Educational Resources Information Center

    Koss, Mary P.; Gidycz, Christine A.

    1985-01-01

    Describes reliability and validity data for the Sexual Experiences Survey, a self-report instrument designed to identify hidden rape victims and undetected offenders among a normal population. (Author/BH)

  12. Quality Profile for SASS: Aspects of the Quality of Data in the Schools and Staffing Surveys (SASS).

    ERIC Educational Resources Information Center

    Jabine, Thomas B.

    This profile presents and summarizes available information about the quality of data from the five surveys that comprise the SASS, along with background material on the survey design and procedures for the following: (1) School Survey; (2) School Administrator Survey; (3) Teacher Demand and Shortage Survey; (4) Teacher Survey; and (5) Teacher…

  13. Planetary transit surveys

    NASA Astrophysics Data System (ADS)

    Horne, K.

    2002-01-01

    I review the status and prospects of ground-based planetary transit surveys in the era before the Eddington and Kepler missions. Over 70 extra-solar planets have been found to date by high precision radial velocity searches, but so far only one of these exhibits planetary transits. This situation should dramatically reverse in the next few years as photometric searches begin to find large numbers of "Hot Jupiters" that transit in front of their host stars. I discuss and illustrate the methods being used to assess the planet catch of the Eddington mission. Scaling laws are derived to express the planet catch in terms of instrument parameters and planet type. More detailed Monte Carlo simulations yield simulated planet catalogs. Eddington's survey volume for Earth-analog planets extends to d ~ 300 pc, and scales with star mass, planet radius and temperature as M^(-1/8) r^2 T, strongly favouring large hot planets. The Eddington baseline design is likely to deliver only a handful of habitable Earths. The planet catch can be increased by design changes, subject of course to cost and feasibility constraints, to maximize the figure of merit Ω d^3 ~ Ω (N A t)^(3/2), where N is the number of co-aligned telescopes, A and Ω are the effective pupil area and solid angle of each telescope, and t is the duration of the mission.
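
    The figure of merit quoted above implies simple trade-offs between telescope count, aperture, field of view, and mission length. The minimal sketch below evaluates that scaling; the numerical parameter values are placeholders chosen for illustration and are not the actual Eddington design parameters.

        def planet_catch_scaling(n_telescopes, pupil_area_m2, solid_angle_sr, duration_yr):
            # Figure of merit quoted in the abstract: Omega * d^3 ~ Omega * (N * A * t)^(3/2).
            # Returns a number proportional to the expected planet catch; the absolute
            # normalisation is arbitrary.
            return solid_angle_sr * (n_telescopes * pupil_area_m2 * duration_yr) ** 1.5

        # Placeholder parameter values (not the actual Eddington design):
        baseline = planet_catch_scaling(4, 0.07, 0.006, 3.0)
        longer = planet_catch_scaling(4, 0.07, 0.006, 6.0)
        print(f"doubling the mission duration multiplies the catch by {longer / baseline:.2f}")  # ~2.83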

  14. Survey overview Instrument Construction

    E-print Network

    Sheridan, Jennifer

    Presentation outline: survey overview, instrument construction, survey logistics, response rates, and uses of the survey. Faculty and staff (September 2003) data are stored in SAS datasets. Reported staff response rate: 47.6% (N=513), with women responding at a higher rate.

  15. Quantitative dynamic SPECT tomography

    SciTech Connect

    Haber, E.; Oldenburg, D. [UBC Geophysical Inversion Facility, Vancouver (Canada); Farnocombe, T.; Celler, A. [Vancouver Hospital and Health Sciences Centre, Vancouver (Canada)

    1996-12-31

    Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) are two imaging methods that provide functional information. This information is obtained by reconstructing the distribution of tracers. Since the tracer is designed to target specific physiological activity, its distribution indicates which areas are physiologically active and which are not. Regular reconstruction methods do not indicate how the activity changes in time; that is the goal of dynamic studies. Dynamic SPECT or PET tomography is the study of the kinetics of a tracer, i.e. how the tracer's distribution changes in time. Modelling of such behavior suggests that it is an exponentially decaying process that can be characterized by a few exponentials.
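
    The kind of kinetic model the abstract describes can be written as a time-activity curve composed of a small number of decaying exponentials. The sketch below is a minimal illustration of that idea; the two-component model, the acquisition times, and the parameter values are assumptions for demonstration, not the authors' specific method.

        import numpy as np

        def time_activity(t, amplitudes, decay_rates):
            # Sum-of-exponentials kinetic model: A(t) = sum_k a_k * exp(-lambda_k * t).
            # The two-component choice below is illustrative, not the authors' model.
            t = np.asarray(t, dtype=float)
            return sum(a * np.exp(-lam * t) for a, lam in zip(amplitudes, decay_rates))

        t = np.linspace(0.0, 30.0, 16)   # acquisition times (assumed, in minutes)
        curve = time_activity(t, amplitudes=[5.0, 2.0], decay_rates=[0.30, 0.05])
        print(curve[:4])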

  16. Near-Earth Object Survey Orbit Quality Analysis

    NASA Astrophysics Data System (ADS)

    Buie, Marc W.

    2013-05-01

    The Sentinel Mission is currently under development by the B612 Foundation and Ball Aerospace. The mission concept is based on a space-based infrared telescope in an independent orbit similar to that of Venus. Being in an orbit interior to the Earth's greatly reduces the time to complete the survey of the near-Earth region compared to a similar survey conducted from a ground-based or Earth-orbiting observatory. One of the key mission design elements is the cadence of observation. This involves the tiling pattern for how the instrument field-of-view maps out the sky and the repeat interval between successive observations. This presentation will show a quantitative analysis of orbit determination from this type of platform and show how the expected distribution of NEOs will be observed and the orbit qualities that will result. From this analysis, limits can then be placed on the degree of confusion that the cadence can tolerate before linking different epochs becomes problematic.

  17. Introduction to the CANDELS Survey

    NASA Astrophysics Data System (ADS)

    Faber, S. M.; Ferguson, H. C.; CANDELS Team

    2012-01-01

    The Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey (CANDELS) is designed to document the first third of galactic evolution, from z ~ 8 to z ~ 1.5. It will image >250,000 distant galaxies using three separate cameras on the Hubble Space Telescope from the mid-UV to near-IR and will find and measure Type Ia supernovae beyond z > 1.5 to test their accuracy as standard candles for cosmology. Five premier sky regions are selected, each with extensive multi-wavelength data. The use of five widely separated fields mitigates cosmic variance and yields statistically robust and complete samples of galaxies down to a stellar mass of 10^9 solar masses at z ~ 2 and to the knee of the UV luminosity function at z ~ 8. The survey covers approximately 800 arcmin^2 and is divided into two parts. The CANDELS/Deep survey (5-sigma point-source limit H_AB = 27.7 mag) covers 125 arcmin^2 within GOODS-N and GOODS-S on ten separate visits. The CANDELS/Wide survey (5-sigma point-source limit H_AB ~ 27.0 mag) images all of GOODS and three additional fields (EGS, COSMOS, and UDS) and covers the full area on two visits. Together with the Hubble Ultradeep Fields, this strategy replicates the "wedding cake" approach that has proven effective for extragalactic surveys. Extensive parallel imaging with the Advanced Camera for Surveys creates a new ACS mosaic in UDS, deepens four existing ones, and provides high-resolution Hubble panchromatic imaging from 0.4 μm to 1.6 μm. Multiple visits to all fields permit variability studies and supernovae searches, and special deep UV observations cover half of GOODS-N. Data from the survey are non-proprietary and are useful for a wide variety of science investigations. In this talk, we review the scientific goals; observational requirements; field selection, geometry, and observing design; schedule; and the public data products.
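
    The Deep/Wide depths quoted above are roughly consistent with point-source depth improving as the square root of total exposure time. The minimal sketch below checks that, assuming equal exposure per visit; this is an illustrative assumption rather than the actual CANDELS exposure plan.

        import math

        def depth_gain_mag(n_visits_deep, n_visits_wide):
            # Gain in 5-sigma point-source depth (magnitudes) from stacking more visits,
            # assuming noise averages down as sqrt(exposure) and equal exposure per visit
            # -- an illustrative assumption, not the actual CANDELS exposure plan.
            return 2.5 * math.log10(math.sqrt(n_visits_deep / n_visits_wide))

        print(f"predicted Deep - Wide depth gain: {depth_gain_mag(10, 2):.2f} mag")  # ~0.87 vs the quoted 0.7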

  18. The Kilo-Degree Survey

    NASA Astrophysics Data System (ADS)

    de Jong, J. T. A.; Kuijken, K.; Applegate, D.; Begeman, K.; Belikov, A.; Blake, C.; Bout, J.; Boxhoorn, D.; Buddelmeijer, H.; Buddendiek, A.; Cacciato, M.; Capaccioli, M.; Choi, A.; Cordes, O.; Covone, G.; Dall'Ora, M.; Edge, A.; Erben, T.; Franse, J.; Getman, F.; Grado, A.; Harnois-Deraps, J.; Helmich, E.; Herbonnet, R.; Heymans, C.; Hildebrandt, H.; Hoekstra, H.; Huang, Z.; Irisarri, N.; Joachimi, B.; Köhlinger, F.; Kitching, T.; La Barbera, F.; Lacerda, P.; McFarland, J.; Miller, L.; Nakajima, R.; Napolitano, N. R.; Paolillo, M.; Peacock, J.; Pila-Diez, B.; Puddu, E.; Radovich, M.; Rifatto, A.; Schneider, P.; Schrabback, T.; Sifon, C.; Sikkema, G.; Simon, P.; Sutherland, W.; Tudorica, A.; Valentijn, E.; van der Burg, R.; van Uitert, E.; van Waerbeke, L.; Velander, M.; Kleijn, G. V.; Viola, M.; Vriend, W.-J.

    2013-12-01

    The Kilo-Degree Survey (KiDS), a 1500-square-degree optical imaging survey with the recently commissioned OmegaCAM wide-field imager on the VLT Survey Telescope (VST), is described. KiDS will image two fields in the u-, g-, r- and i-bands and, together with the VIKING survey, produce nine-band (u- to K-band) coverage over both fields. For the foreseeable future, the KiDS/VIKING combination of superb image quality with wide wavelength coverage will be unique for surveys of its size and depth. The survey has been designed to tackle some of today's most fundamental questions of cosmology and galaxy formation. The main science driver is mapping the dark matter distribution in the Universe and constraining the expansion of the Universe and the equation of state of dark energy, all through weak gravitational lensing. However, the deep and wide imaging data will also facilitate a wide variety of other science cases.

  19. LSST Camera Optics Design

    Microsoft Academic Search

    V J Riot; S Olivier; B Bauman; S Pratuch; L Seppala; D Gilmore; J Ku; M Nordby; M Foss; P Antilogus; N Morgado

    2012-01-01

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated in with the optical design of telescope

  20. Virtual Conferencing Technologies: A survey of users Report of a survey on the use of Access Grid technologies

    E-print Network

    Hickman, Mark

    Acknowledges those who participated in the piloting and testing stages of the survey. Report contents include sections on the design of the questionnaire and on the demographic characteristics (including age) of respondents.