Science.gov

Sample records for quantitative survey design

  1. Telephone Survey Designs.

    ERIC Educational Resources Information Center

    Casady, Robert J.

    The concepts, definitions, and notation that have evolved with the development of telephone survey design methodology are discussed and presented as a unified structure. This structure is then applied to some of the better-known telephone survey designs, and alternative designs are developed. The relative merits of the different survey designs

  2. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  3. MARSAME Develop A Survey Design 4 DEVELOP A SURVEY DESIGN

    E-print Network

    MARSAME, Develop A Survey Design: 4 DEVELOP A SURVEY DESIGN. 4.1 Introduction. Once a decision rule has been developed, a disposition survey can be designed for the impacted materials and equipment (M&E) ... costly and time-consuming development of redundant survey designs. The evaluation of existing SOPs

  4. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  5. Designing Surveys for HCI Research

    E-print Network

    Cortes, Corinna

    Designing Surveys for HCI Research. Hendrik Müller, Google Australia Pty Ltd., 5/48 Pirrama Road ... Abstract: Online surveys are widely used in human-computer interaction (HCI) to gather feedback and measure satisfaction; at a glance many tools are available and the cost of conducting surveys appears low. However

  6. Report on Solar Water Heating Quantitative Survey

    SciTech Connect

    Focus Marketing Services

    1999-05-06

    This report details the results of a quantitative research study undertaken to better understand the marketplace for solar water-heating systems from the perspective of home builders, architects, and home buyers.

  7. TRANSPORTATION TOMORROW SURVEY DESIGN AND CONDUCT OF THE SURVEY

    E-print Network

    Toronto, University of

    TRANSPORTATION TOMORROW SURVEY 2006, DESIGN AND CONDUCT OF THE SURVEY: A Telephone Interview Survey on Household Travel Behaviour in the Greater Toronto ... with Extensions into the Winter of 2006 and the Spring of 2007. Prepared

  8. WATERSHED BASED SURVEY DESIGNS

    EPA Science Inventory

    The development of watershed-based design and assessment tools will help to serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional condition to meet Section 305(b), identification of impaired water bodies or wate...

  9. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  10. Qualities of a Psychiatric Mentor: A Quantitative Singaporean Survey

    ERIC Educational Resources Information Center

    Tor, Phern-Chern; Goh, Lee-Gan; Ang, Yong-Guan; Lim, Leslie; Winslow, Rasaiah-Munidasa; Ng, Beng-Yeong; Wong, Sze-Tai; Ng, Tse-Pin; Kia, Ee-Heok

    2011-01-01

    Objective: Psychiatric mentors are an important part of the new, seamless training program in Singapore. There is a need to assess the qualities of a good psychiatric mentor vis-a-vis those of a good psychiatrist. Method: An anonymous survey was sent out to all psychiatry trainees and psychiatrists in Singapore to assess quantitatively the…

  11. DESIGN AND CONDUCT OF THE SURVEY TRANSPORTATION TOMORROW SURVEY

    E-print Network

    Toronto, University of

    TRANSPORTATION TOMORROW SURVEY 2011 (2011 TTS), DESIGN AND CONDUCT OF THE SURVEY: A Telephone Interview Survey on Household Travel Behaviour in the Greater Toronto and the Surrounding Areas, Conducted in the Fall of 2011 and 2012. Prepared for the Transportation Information

  12. TRANSPORTATION TOMORROW SURVEY DESIGN AND CONDUCT OF THE SURVEY

    E-print Network

    Toronto, University of

    TRANSPORTATION TOMORROW SURVEY 1996, DESIGN AND CONDUCT OF THE SURVEY, FIRST REPORT OF THE 1996 SERIES: A Telephone Interview Survey on Household Travel Behaviour ... Prepared for the Toronto Area Transportation Planning Data Collection Steering

  13. TRANSPORTATION TOMORROW SURVEY DESIGN AND CONDUCT OF THE SURVEY

    E-print Network

    Toronto, University of

    TRANSPORTATION TOMORROW SURVEY 2001, DESIGN AND CONDUCT OF THE SURVEY: A Telephone Interview Survey on Household Travel Behaviour in Greater Toronto ... Prepared for the Transportation Information Steering Committee by the Data Management Group

  14. Watershed-based survey designs.

    PubMed

    Detenbeck, Naomi E; Cincotta, Dan; Denver, Judith M; Greenlee, Susan K; Olsen, Anthony R; Pitchford, Ann M

    2005-04-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream-downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs. PMID:15861987
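    The probabilistic machinery this abstract mentions (unequal probability weighting over a discrete set of watershed polygons) can be sketched in a few lines. The watershed names and areas below are hypothetical, and the with-replacement PPS draw is a deliberately simplified stand-in for the production survey designs the authors describe:

```python
import random

def pps_sample(units, sizes, n, seed=0):
    """Draw a with-replacement sample of n units with selection
    probability proportional to size (PPS), a simple form of
    unequal probability weighting."""
    rng = random.Random(seed)
    total = sum(sizes)
    # cumulative size thresholds, one per watershed polygon
    cum, acc = [], 0.0
    for s in sizes:
        acc += s
        cum.append(acc)
    sample = []
    for _ in range(n):
        u = rng.uniform(0, total)
        for unit, c in zip(units, cum):
            if u <= c:
                sample.append(unit)
                break
    return sample

# hypothetical watershed polygons weighted by catchment area (km^2)
watersheds = ["W1", "W2", "W3", "W4"]
areas = [120.0, 45.0, 300.0, 80.0]
print(pps_sample(watersheds, areas, n=3))
```

    A real watershed design would typically sample without replacement and combine this weighting with stratification, but the cumulative-threshold idea is the same.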

  15. RESOLVE and ECO: Survey Design

    NASA Astrophysics Data System (ADS)

    Kannappan, Sheila; Moffett, Amanda J.; Norris, Mark A.; Eckert, Kathleen D.; Stark, David; Berlind, Andreas A.; Snyder, Elaine M.; Norman, Dara J.; Hoversten, Erik A.; RESOLVE team

    2016-01-01

    The REsolved Spectroscopy Of a Local VolumE (RESOLVE) survey is a volume-limited census of stellar, gas, and dynamical mass as well as star formation and galaxy interactions within >50,000 cubic Mpc of the nearby cosmic web, reaching down to dwarf galaxies of baryonic mass ~10^9 Msun and spanning multiple large-scale filaments, walls, and voids. RESOLVE is surrounded by the ~10x larger Environmental COntext (ECO) catalog, with matched custom photometry and environment metrics enabling analysis of cosmic variance with greater statistical power. For the ~1500 galaxies in its two equatorial footprints, RESOLVE goes beyond ECO in providing (i) deep 21cm data with adaptive sensitivity ensuring HI mass detections or upper limits <10% of the stellar mass and (ii) 3D optical spectroscopy including both high-resolution ionized gas or stellar kinematic data for each galaxy and broad 320-725nm spectroscopy spanning [OII] 3727, Halpha, and Hbeta. RESOLVE is designed to complement other radio and optical surveys in providing diverse, contiguous, and uniform local/global environment data as well as unusually high completeness extending into the gas-dominated dwarf galaxy regime. RESOLVE also offers superb reprocessed photometry including full, deep NUV coverage and synergy with other equatorial surveys as well as unique northern and southern facilities such as Arecibo, the GBT, and ALMA. The RESOLVE and ECO surveys have been supported by funding from NSF grants AST-0955368 and OCI-1156614.

  16. Quantitative three-dimensional low-speed wake surveys

    NASA Technical Reports Server (NTRS)

    Brune, G. W.

    1992-01-01

    Theoretical and practical aspects of conducting three-dimensional wake measurements in large wind tunnels are reviewed with emphasis on applications in low-speed aerodynamics. Such quantitative wake surveys furnish separate values for the components of drag, such as profile drag and induced drag, but also measure lift without the use of a balance. In addition to global data, details of the wake flowfield as well as spanwise distributions of lift and drag are obtained. The paper demonstrates the value of this measurement technique using data from wake measurements conducted by Boeing on a variety of low-speed configurations including the complex high-lift system of a transport aircraft.
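    The core idea behind such wake surveys, recovering profile drag from the measured momentum deficit in the wake, can be illustrated numerically. This is a minimal incompressible sketch with hypothetical wake-rake velocities on equal-area cells, not the actual Boeing procedure:

```python
def momentum_deficit_drag(rho, U_inf, u_wake, dA):
    """Profile-drag estimate from a wake survey via the momentum
    deficit: D = rho * sum(u * (U_inf - u)) * dA over the wake grid."""
    return rho * sum(u * (U_inf - u) * dA for u in u_wake)

# hypothetical wake-rake velocities (m/s) on equal-area cells
rho = 1.225            # air density, kg/m^3
U_inf = 50.0           # freestream speed, m/s
u = [48.0, 45.0, 47.0, 49.5]
dA = 0.01              # cell area, m^2
D = momentum_deficit_drag(rho, U_inf, u, dA)
print(D)
```

    Separating induced drag, or recovering lift and its spanwise distribution, requires the crossflow (vorticity) components of the wake as well; this sketch covers only the streamwise deficit term.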

  17. 76 FR 27384 - Agency Information Collection Activity (Veteran Suicide Prevention Online Quantitative Surveys...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-11

    ...DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-New...Information Collection Activity (Veteran Suicide Prevention Online Quantitative Surveys) Under OMB Review AGENCY: Veterans Health Administration,...

  18. A Survey of Network Design Problems

    E-print Network

    Wong, Richard T.

    Network design problems arise in many different application areas such as air freight, highway traffic, and communication systems. The intention of this survey is to present a coherent unified view of a number of papers ...

  19. A Survey of Network Design Problems

    E-print Network

    Wong, Richard T.

    This report is a survey of the design of various types of networks that frequently occur in the study of transportation and communication problems. The report contains a general framework which facilitates comparisons ...

  20. Design for manufacturability: quantitative measures for design evaluation 

    E-print Network

    Polisetty, Francis Showry Kumar

    1997-01-01

    In a design for manufacturing (DFM) approach, the designer has to consider the interactions between the various parameters in the design and the ease with which it can be manufactured, very early in the design process. This research is aimed...

  1. A quantitative and objective evaluation approach for optimal selection of design concept in conceptual design stage 

    E-print Network

    Tiwari, Sanjay

    2002-01-01

    design stage. The guideline helps in establishing quantitative measures to compare design concepts and removes the subjective nature of the concept evaluation process. A superior design is one that is best in both functional as well as geometrical...

  2. Design-based and model-based inference in surveys of freshwater mollusks

    USGS Publications Warehouse

    Dorazio, R.M.

    1999-01-01

    Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.
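    The design-based route mentioned above can be made concrete with a Horvitz-Thompson estimate of mean density. The quadrat counts, inclusion probabilities, and frame size below are invented for illustration:

```python
def horvitz_thompson_total(values, inclusion_probs):
    """Design-based (Horvitz-Thompson) estimate of a population total:
    each observed count is weighted by the inverse of its sampling
    unit's inclusion probability."""
    return sum(y / p for y, p in zip(values, inclusion_probs))

# hypothetical quadrat counts of mollusks and the design's inclusion probs
counts = [4, 0, 7, 2]       # mollusks per sampled quadrat
pi = [0.1, 0.1, 0.2, 0.2]   # inclusion probability of each quadrat
N = 40                      # total quadrats in the sampling frame
total_hat = horvitz_thompson_total(counts, pi)
mean_density_hat = total_hat / N
print(total_hat, mean_density_hat)  # 85.0 mollusks, 2.125 per quadrat
```

    Model-based inference would instead posit a distribution for the counts (e.g. negative binomial for a sparsely clustered population) and estimate density from the fitted model rather than from the design weights.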

  3. Statistical considerations in designing raptor surveys

    USGS Publications Warehouse

    Pendleton, G.W.

    1989-01-01

    Careful sampling design is required to obtain useful estimates of raptor abundance. Well-defined objectives, selection of appropriate sample units and sampling scheme, and attention to detail to reduce extraneous sources of variability and error are all important considerations in designing a raptor survey.

  4. WATERSHED-BASED SURVEY DESIGNS

    EPA Science Inventory

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Sectio...

  5. Ambulance Design Survey 2011: A Summary Report

    PubMed Central

    Lee, Y Tina; Kibira, Deogratias; Feeney, Allison Barnard; Marshall, Jennifer

    2013-01-01

    Current ambulance designs are ergonomically inefficient and oftentimes unsafe for practical treatment response to medical emergencies. Thus, the patient compartment of a moving ambulance is a hazardous working environment. As a consequence, emergency medical services (EMS) workers suffer fatalities and injuries that far exceed those of the average workplace in the United States. To reduce injury and mortality rates in ambulances, the Department of Homeland Security Science and Technology Directorate has teamed with the National Institute of Standards and Technology, the National Institute for Occupational Safety and Health, and BMT Designers & Planners in a joint project to produce science-based ambulance patient compartment design standards. This project will develop new crash-safety design standards and improved user-design interface guidance for patient compartments that are safer for EMS personnel and patients, and facilitate improved patient care. The project team has been working with practitioners, EMS workers’ organizations, and manufacturers to solicit needs and requirements to address related issues. This paper presents an analysis of practitioners’ concerns, needs, and requirements for improved designs elicited through the web-based survey of ambulance design conducted by the National Institute of Standards and Technology. This paper also introduces the survey, analyzes the survey results, and discusses recommendations for future ambulance patient compartment design. PMID:26401439

  7. Spatially balanced survey designs for natural resources

    EPA Science Inventory

    Ecological resource monitoring programs typically require the use of a probability survey design to select locations or entities to be physically sampled in the field. The ecological resource of interest, the target population, occurs over a spatial domain and the sample selecte...

  8. Despite their utility, trawl surveys cannot obtain quantitative samples from

    E-print Network

    Despite their utility, trawl surveys cannot obtain quantitative samples from rough, rocky ... sampling power of the submersible survey as a tool to discriminate density differences between ... trawlable habitats, and thus have a limited ability to sample all habitats representatively (Uzmann et al., 1977

  9. PH438 Survey Design and Methodology, Syllabus, Fall 2015

    E-print Network

    Contractor, Anis

    ... design, implementation, analysis, and interpretation of surveys and questionnaires in public health research. ... will emphasize hands-on experience in the design, administration, analysis and interpretation of survey data from

  10. The Dark Energy Survey CCD imager design

    SciTech Connect

    Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Guarino, V.; Kuk, K.; Kuhlmann, S.; Schultz, K.; Schmitt, R.L.; Stefanik, A.; /Fermilab /Ohio State U. /Argonne

    2008-06-01

    The Dark Energy Survey is planning to use a 3 sq. deg. camera that houses a ~0.5 m diameter focal plane of 62 2k x 4k CCDs. The camera vessel including the optical window cell, focal plate, focal plate mounts, cooling system and thermal controls is described. As part of the development of the mechanical and cooling design, a full scale prototype camera vessel has been constructed and is now being used for multi-CCD readout tests. Results from this prototype camera are described.

  11. 76 FR 9637 - Proposed Information Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-18

    ... Proposed Information Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity: Comment... outreach efforts on the prevention of suicide among Veterans and their families. DATES: Written comments...). Type of Review: New collection. Abstract: VA's top priority is the prevention of Veterans suicide....

  12. 76 FR 27384 - Agency Information Collection Activity (Veteran Suicide Prevention Online Quantitative Surveys...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-11

    ... Agency Information Collection Activity (Veteran Suicide Prevention Online Quantitative Surveys) Under OMB...). Type of Review: New collection. Abstract: VA's top priority is the prevention of Veterans suicide. It... better understand Veterans and their families' awareness of VA's suicide prevention and mental...

  13. Adaptive time-lapse optimized survey design for electrical resistivity tomography monitoring

    NASA Astrophysics Data System (ADS)

    Wilkinson, Paul B.; Uhlemann, Sebastian; Meldrum, Philip I.; Chambers, Jonathan E.; Carrière, Simon; Oxby, Lucy S.; Loke, M. H.

    2015-10-01

    Adaptive optimal experimental design methods use previous data and results to guide the choice and design of future experiments. This paper describes the formulation of an adaptive survey design technique to produce optimal resistivity imaging surveys for time-lapse geoelectrical monitoring experiments. These survey designs are time-dependent and, compared to dipole-dipole or static optimized surveys that do not change over time, focus a greater degree of the image resolution on regions of the subsurface that are actively changing. The adaptive optimization method is validated using a controlled laboratory monitoring experiment comprising a well-defined cylindrical target moving along a trajectory that changes its depth and lateral position. The algorithm is implemented on a standard PC in conjunction with a modified automated multichannel resistivity imaging system. Data acquisition using the adaptive survey designs requires no more time or power than with comparable standard surveys, and the algorithm processing takes place while the system batteries recharge. The results show that adaptively designed optimal surveys yield a quantitative increase in image quality over and above that produced by using standard dipole-dipole or static (time-independent) optimized surveys.

  14. Online Survey Design and Development: A Janus-Faced Approach

    ERIC Educational Resources Information Center

    Lauer, Claire; McLeod, Michael; Blythe, Stuart

    2013-01-01

    In this article we propose a "Janus-faced" approach to survey design--an approach that encourages researchers to consider how they can design and implement surveys more effectively using the latest web and database tools. Specifically, this approach encourages researchers to look two ways at once; attending to both the survey interface…

  15. Quantitative performance-based evaluation of a procedure for flexible design concept generation

    E-print Network

    Cardin, Michel-Alexandre, 1979-

    2011-01-01

    This thesis presents an experimental methodology for objective and quantitative design procedure evaluation based on anticipated lifecycle performance of design concepts, and a procedure for flexible design concept generation. ...

  16. 5 SURVEY PLANNING AND DESIGN 5.1 Introduction

    E-print Network

    This chapter is intended to assist the user in planning a strategy for conducting a final status survey, with the ultimate objective being to demonstrate compliance with the derived concentration guideline levels (DCGLs). The survey types that make up

  17. Design and Validation of the Quantum Mechanics Conceptual Survey

    ERIC Educational Resources Information Center

    McKagan, S. B.; Perkins, K. K.; Wieman, C. E.

    2010-01-01

    The Quantum Mechanics Conceptual Survey (QMCS) is a 12-question survey of students' conceptual understanding of quantum mechanics. It is intended to be used to measure the relative effectiveness of different instructional methods in modern physics courses. In this paper, we describe the design and validation of the survey, a process that included…

  18. Ordered Designs and Bayesian Inference in Survey Sampling

    E-print Network

    Meeden, Glen

    Ordered Designs and Bayesian Inference in Survey Sampling. Glen Meeden, School of Statistics. Abstract: Many sampling designs, such as simple random sampling without replacement ... the Bayesian analysis uses information that in standard frequentist methods is incorporated in the sampling

  19. Research on Basic Design Education: An International Survey

    ERIC Educational Resources Information Center

    Boucharenc, C. G.

    2006-01-01

    This paper reports on the results of a survey and qualitative analysis on the teaching of "Basic Design" in schools of design and architecture located in 22 countries. In the context of this research work, Basic Design means the teaching and learning of design fundamentals that may also be commonly referred to as the Principles of Two- and…

  20. Survey design strategies for linearized nonlinear inversion Andrew Curtis and Carl Spencer, Schlumberger

    E-print Network

    Survey design strategies for linearized nonlinear inversion. Andrew Curtis and Carl Spencer, Schlumberger. SUMMARY: Standard nonlinear survey or experimental design criteria generally contain ... a more robust, nonlinear design strategy. INTRODUCTION: Many factors affect the design of a survey

  1. Designing community surveys to provide a basis for noise policy

    NASA Technical Reports Server (NTRS)

    Fields, J. M.

    1980-01-01

    After examining reports from a large number of social surveys, two areas were identified where methodological improvements in the surveys would be especially useful for public policy. The two study areas are: the definition of noise indexes and the assessment of noise impact. Improvements in the designs of surveys are recommended which would increase the validity and reliability of the noise indexes. Changes in interview questions and sample designs are proposed which would enable surveys to provide measures of noise impact which are directly relevant for public policy.

  2. Hemostatic assessment, treatment strategies, and hematology consultation in massive postpartum hemorrhage: results of a quantitative survey of obstetrician-gynecologists

    PubMed Central

    James, Andra H; Cooper, David L; Paidas, Michael J

    2015-01-01

    Objective To assess potential diagnostic and practice barriers to successful management of massive postpartum hemorrhage (PPH), emphasizing recognition and management of contributing coagulation disorders. Study design A quantitative survey was conducted to assess practice patterns of US obstetrician-gynecologists in managing massive PPH, including assessment of coagulation. Results Nearly all (98%) of the 50 obstetrician-gynecologists participating in the survey reported having encountered at least one patient with “massive” PPH in the past 5 years. Approximately half (52%) reported having previously discovered an underlying bleeding disorder in a patient with PPH, with disseminated intravascular coagulation (88%, n=23/26) being identified more often than von Willebrand disease (73%, n=19/26). All reported having used methylergonovine and packed red blood cells in managing massive PPH, while 90% reported performing a hysterectomy. A drop in blood pressure and ongoing visible bleeding were the most commonly accepted indications for rechecking a “stat” complete blood count and coagulation studies, respectively, in patients with PPH; however, 4% of respondents reported that they would not routinely order coagulation studies. Forty-two percent reported having never consulted a hematologist for massive PPH. Conclusion The survey findings highlight potential areas for improved practice in managing massive PPH, including earlier and more consistent assessment, monitoring of coagulation studies, and consultation with a hematologist. PMID:26604829

  3. PiMA Survey Design and Methodology

    E-print Network

    Mudhai, Okoth Fred; Abreu Lopes, Claudia; Mitullah, Winnie; Fraser, Alastair; Milapo, Nalukui; Mwangi, Sammy; (PI) Srinivasan, Sharath

    2015-06-23

    ... Singer, E., and Tourangeau, R. (2009), Survey Methodology. Hoboken, NJ: John Wiley and Sons. Independent Electoral Boundaries Commission (2013a), Summary of all Elective Positions in the March 4th 2013 General Election. Last update 18 July 2013...

  4. Designing occupancy studies: general advice and allocating survey effort

    USGS Publications Warehouse

    MacKenzie, D.I.; Royle, J. Andrew

    2005-01-01

    1. The fraction of sampling units in a landscape where a target species is present (occupancy) is an extensively used concept in ecology. Yet in many applications the species will not always be detected in a sampling unit even when present, resulting in biased estimates of occupancy. Given that sampling units are surveyed repeatedly within a relatively short timeframe, a number of similar methods have now been developed to provide unbiased occupancy estimates. However, practical guidance on the efficient design of occupancy studies has been lacking. 2. In this paper we comment on a number of general issues related to designing occupancy studies, including the need for clear objectives that are explicitly linked to science or management, selection of sampling units, timing of repeat surveys and allocation of survey effort. Advice on the number of repeat surveys per sampling unit is considered in terms of the variance of the occupancy estimator, for three possible study designs. 3. We recommend that sampling units should be surveyed a minimum of three times when detection probability is high (>0.5 per survey), unless a removal design is used. 4. We found that an optimal removal design will generally be the most efficient, but we suggest it may be less robust to assumption violations than a standard design. 5. Our results suggest that for a rare species it is more efficient to survey more sampling units less intensively, while for a common species fewer sampling units should be surveyed more intensively. 6. Synthesis and applications. Reliable inferences can only result from quality data. To make the best use of logistical resources, study objectives must be clearly defined; sampling units must be selected, and repeated surveys timed appropriately; and a sufficient number of repeated surveys must be conducted. Failure to do so may compromise the integrity of the study. The guidance given here on study design issues is particularly applicable to studies of species occurrence and distribution, habitat selection and modelling, metapopulation studies and monitoring programmes.
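    The reasoning behind recommending a minimum number of repeat surveys follows from the cumulative detection probability 1 - (1 - p)^k. A small sketch, assuming independent visits with a constant per-survey detection probability p:

```python
def cumulative_detection(p, k):
    """Probability that a species present at a site is detected at
    least once in k independent repeat surveys, each with per-survey
    detection probability p."""
    return 1.0 - (1.0 - p) ** k

# With per-survey detection probability 0.5, three repeat visits
# detect a present species 1 - 0.5**3 = 87.5% of the time.
print(cumulative_detection(0.5, 3))
```

    This is only one ingredient of the design problem; the paper's recommendations also weigh the variance of the occupancy estimator and the trade-off between number of sites and visits per site.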

  5. Optimal design of focused experiments and surveys Andrew Curtis

    E-print Network

    ... that the quality of any particular design can be both quantified and then maximized. This study shows how. The quality of an experiment design can be defined as the degree to which the experiment will satisfy some ...

  6. Recent advances in optimized geophysical survey design, Hansruedi Maurer

    E-print Network

    …method. In contrast, an underdetermined 2D seismic traveltime tomography design study indicates subsurface information as the full bandwidth. A non-linear experimental design for a seismic amplitude… (Recent advances in optimized geophysical survey design, Hansruedi Maurer, Andrew Curtis)

  7. Use of multispectral data in design of forest sample surveys

    NASA Technical Reports Server (NTRS)

    Titus, S. J.; Wensel, L. C.

    1977-01-01

    The use of multispectral data in design of forest sample surveys using a computer software package is described. The system allows evaluation of a number of alternative sampling systems and, with appropriate cost data, estimates the implementation cost for each.

  8. Patchy distribution fields: a spiral survey design and reconstruction adequacy.

    PubMed

    Kalikhman, I

    2007-01-01

    A mathematical model was used to examine the effects of a spiral survey design on the adequacy of reconstructing patchy distribution fields. The model simulates fish or plankton patches (or gaps) of different shapes and spatial orientations, and acoustic surveys by a spiral of Archimedes; for comparison, surveys by parallel or zigzag transects are imitated. Adequacy of the reconstructed fields to those originally generated was evaluated by calculating their correlations (r). The mathematical experiments conducted showed that spiral surveys ensure practically the same adequacy of field reconstruction (for both immovable and movable patches) as do surveys by parallel or zigzag transects with greater sampling effort (overall path). In the case of a spiral survey, a patchy field can be reconstructed properly (r(2) = 0.70) if the overall survey path is not less than S/R(av) = 20-30, where R(av) is the autocorrelation radius averaged over various directions. Thus, the results obtained allow us to conclude that a spiral survey design is expedient when the minimal duration of a survey is a decisive factor and there is a priori information that no onshore-offshore gradients of fish density exist in the region under study. PMID:17058017
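The paper's adequacy measure can be mimicked in a toy experiment. The sketch below, with purely illustrative field, grid, and spiral parameters (not those of the study), samples a synthetic patchy field along a spiral of Archimedes, reconstructs it by nearest-neighbour interpolation, and scores the reconstruction by the squared correlation r² with the original field:

```python
import numpy as np

# Toy re-creation of the comparison: all parameters are illustrative
# assumptions, not values from the paper.
rng = np.random.default_rng(0)
n = 40
xs, ys = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))

# A "patchy" field: sum of a few Gaussian patches.
field = np.zeros_like(xs)
for cx, cy in rng.uniform(-0.7, 0.7, size=(4, 2)):
    field += np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / 0.05)

# Spiral of Archimedes, r = a * theta, sampled along its path.
theta = np.linspace(0, 12 * np.pi, 800)
a = 1.0 / (12 * np.pi)
sx, sy = a * theta * np.cos(theta), a * theta * np.sin(theta)

# "Measure" the field at the spiral sample points (nearest grid cell).
ix = np.clip(np.round((sx + 1) / 2 * (n - 1)).astype(int), 0, n - 1)
iy = np.clip(np.round((sy + 1) / 2 * (n - 1)).astype(int), 0, n - 1)
samples = field[iy, ix]

# Reconstruct on the full grid from the nearest sampled point.
grid = np.column_stack([xs.ravel(), ys.ravel()])
pts = np.column_stack([sx, sy])
nearest = np.argmin(((grid[:, None, :] - pts[None, :, :]) ** 2).sum(-1), axis=1)
recon = samples[nearest].reshape(n, n)

r = np.corrcoef(field.ravel(), recon.ravel())[0, 1]
print(f"r^2 = {r ** 2:.2f}")
```

Lengthening or shortening the spiral (the range of theta) changes the overall survey path S and with it the reconstruction adequacy, which is the trade-off the paper quantifies.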

  9. Lack of quantitative training among early-career ecologists: a survey of the problem and potential solutions

    PubMed Central

    Ezard, Thomas H.G.; Jørgensen, Peter S.; Zimmerman, Naupaka; Chamberlain, Scott; Salguero-Gómez, Roberto; Curran, Timothy J.; Poisot, Timothée

    2014-01-01

    Proficiency in mathematics and statistics is essential to modern ecological science, yet few studies have assessed the level of quantitative training received by ecologists. To do so, we conducted an online survey. The 937 respondents were mostly early-career scientists who studied biology as undergraduates. We found a clear self-perceived lack of quantitative training: 75% were not satisfied with their understanding of mathematical models; 75% felt that the level of mathematics was “too low” in their ecology classes; 90% wanted more mathematics classes for ecologists; and 95% more statistics classes. Respondents thought that 30% of classes in ecology-related degrees should be focused on quantitative disciplines, which is likely higher than for most existing programs. The main suggestion to improve quantitative training was to relate theoretical and statistical modeling to applied ecological problems. Improving quantitative training will require dedicated, quantitative classes for ecology-related degrees that contain good mathematical and statistical practice. PMID:24688862

  10. Designing surveys for tests of gravity.

    PubMed

    Jain, Bhuvnesh

    2011-12-28

    Modified gravity theories may provide an alternative to dark energy to explain cosmic acceleration. We argue that the observational programme developed to test dark energy needs to be augmented to capture new tests of gravity on astrophysical scales. Several distinct signatures of gravity theories exist outside the 'linear' regime, especially owing to the screening mechanism that operates inside halos such as the Milky Way to ensure that gravity tests in the solar system are satisfied. This opens up several decades in length scale and classes of galaxies at low redshift that can be exploited by surveys. While theoretical work on models of gravity is in the early stages, we can already identify new regimes that cosmological surveys could target to test gravity. These include: (i) a small-scale component that focuses on the interior and vicinity of galaxy and cluster halos, (ii) spectroscopy of low-redshift galaxies, especially galaxies smaller than the Milky Way, in environments that range from voids to clusters, and (iii) a programme of combining lensing and dynamical information, from imaging and spectroscopic surveys, respectively, on the same (or statistically identical) sample of galaxies. PMID:22084295

  11. A New Design for Survey Feedback.

    ERIC Educational Resources Information Center

    Alderfer, Clayton P.; Holbrook, John

    This study presents a theoretical discussion analyzing and explaining the use of group methods in feeding back diagnostic data to organizations. A new design -- the peer group-intergroup model -- is presented and compared to the traditional family group model. Data evaluating one implementation of this design showed that it was associated with…

  12. Optical Design for a Survey X-Ray Telescope

    NASA Technical Reports Server (NTRS)

    Saha, Timo T.; Zhang, William W.; McClelland, Ryan S.

    2014-01-01

    Optical design trades are underway at the Goddard Space Flight Center to define a telescope for an x-ray survey mission. Top-level science objectives of the mission include the study of x-ray transients, surveying and long-term monitoring of compact objects in nearby galaxies, as well as both deep and wide-field x-ray surveys. In this paper we consider Wolter, Wolter-Schwarzschild, and modified Wolter-Schwarzschild telescope designs as basic building blocks for the tightly nested survey telescope. Design principles and dominating aberrations of individual telescopes and nested telescopes are discussed and we compare the off-axis optical performance at 1.0 keV and 4.0 keV across a 1.0-degree full field-of-view.

  13. A Survey and Taxonomy of GALS Design Styles

    E-print Network

    Lemieux, Guy

    A Survey and Taxonomy of GALS Design Styles. Paul Teehan, Mark Greenstreet, and Guy Lemieux. "…find the concepts and taxonomy presented here very useful." --Sandeep Shukla, Virginia Tech. In this article, we describe some design examples and introduce our taxonomy of these techniques.

  14. The Dark Energy Survey instrument design

    SciTech Connect

    Flaugher, B.; /Fermilab

    2006-05-01

    We describe a new project, the Dark Energy Survey (DES), aimed at measuring the dark energy equation of state parameter, w, to a statistical precision of {approx}5%, with four complementary techniques. The survey will use a new 3 sq. deg. mosaic camera (DECam) mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). DECam includes a large mosaic camera, a five element optical corrector, four filters (g,r,i,z), and the associated infrastructure for operation in the prime focus cage. The focal plane consists of 62 2K x 4K CCD modules (0.27''/pixel) arranged in a hexagon inscribed within the 2.2 deg. diameter field of view. We plan to use the 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). At Fermilab, we will establish a packaging factory to produce four-side buttable modules for the LBNL devices, as well as to test and grade the CCDs. R&D is underway and delivery of DECam to CTIO is scheduled for 2009.

  15. Design and Architecture of Collaborative Online Communities: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Aviv, Reuven; Erlich, Zippy; Ravid, Gilad

    2004-01-01

    This paper considers four aspects of online communities: design, mechanisms, architecture, and the constructed knowledge. We hypothesize that different designs of communities drive different mechanisms, which give rise to different architectures, which in turn result in different levels of collaborative knowledge construction. To test this chain…

  16. Simulating future uncertainty to guide the selection of survey designs for long-term monitoring

    USGS Publications Warehouse

    Garman, Steven L.; Schweiger, E. William; Manier, Daniel J.

    2012-01-01

    A goal of environmental monitoring is to provide sound information on the status and trends of natural resources (Messer et al. 1991, Theobald et al. 2007, Fancy et al. 2009). When monitoring observations are acquired by measuring a subset of the population of interest, probability sampling as part of a well-constructed survey design provides the most reliable and legally defensible approach to achieve this goal (Cochran 1977, Olsen et al. 1999, Schreuder et al. 2004; see Chapters 2, 5, 6, 7). Previous works have described the fundamentals of sample surveys (e.g. Hansen et al. 1953, Kish 1965). Interest in survey designs and monitoring over the past 15 years has led to extensive evaluations and new developments of sample selection methods (Stevens and Olsen 2004), of strategies for allocating sample units in space and time (Urquhart et al. 1993, Overton and Stehman 1996, Urquhart and Kincaid 1999), and of estimation (Lesser and Overton 1994, Overton and Stehman 1995) and variance properties (Larsen et al. 1995, Stevens and Olsen 2003) of survey designs. Carefully planned, “scientific” (Chapter 5) survey designs have become a standard in contemporary monitoring of natural resources. Based on our experience with the long-term monitoring program of the US National Park Service (NPS; Fancy et al. 2009; Chapters 16, 22), operational survey designs tend to be selected using the following procedures. For a monitoring indicator (i.e. variable or response), a minimum detectable trend requirement is specified, based on the minimum level of change that would be considered meaningful (e.g. degradation). A probability of detecting this trend (statistical power) and an acceptable level of uncertainty (Type I error; see Chapter 2) within a specified time frame (e.g. 10 years) are specified to ensure timely detection. Explicit statements of the minimum detectable trend, the time frame for detecting the minimum trend, power, and acceptable probability of Type I error (α) 
collectively form the quantitative sampling objective.
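Such a quantitative sampling objective can be checked by simulation. A hedged sketch, with illustrative trend sizes and noise levels (not values from any NPS protocol): estimate the power to detect a linear trend in a 10-year annual record by fitting an OLS slope and counting how often it is significant at a two-sided Type I error of 0.05 (t critical value 2.306 for 8 degrees of freedom):

```python
import numpy as np

rng = np.random.default_rng(42)

def trend_power(slope, sigma, years=10, crit=2.306, n_sim=4000):
    """Monte Carlo power: fraction of simulated annual series whose
    fitted OLS slope is significant (|t| > crit; crit is the two-sided
    5% t value for years - 2 = 8 degrees of freedom)."""
    t = np.arange(years)
    sxx = ((t - t.mean()) ** 2).sum()
    hits = 0
    for _ in range(n_sim):
        y = slope * t + rng.normal(0.0, sigma, years)
        b = ((t - t.mean()) * (y - y.mean())).sum() / sxx   # OLS slope
        resid = y - y.mean() - b * (t - t.mean())
        se = np.sqrt(resid @ resid / (years - 2) / sxx)      # slope SE
        hits += abs(b / se) > crit
    return hits / n_sim

print(trend_power(slope=0.5, sigma=1.0))   # strong trend: power near 1
print(trend_power(slope=0.1, sigma=1.0))   # weak trend: power well below 0.5
```

Iterating this over candidate designs (record length, revisit frequency, number of sites) is how the minimum detectable trend, time frame, power, and Type I error are traded off against each other.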

  17. Integrated Analysis and Tools for Land Subsidence Surveying and Monitoring: a Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Mosconi, A.; Pozzoli, A.; Meroni, A.; Gagliano, S.

    2015-10-01

    This paper presents an integrated approach for land subsidence monitoring using measurements coming from different sensors. Eni S.p.A., the main Italian oil and gas company, constantly surveys the land with state-of-the-art and innovative techniques, and a method able to integrate the results is an important and timely topic. Nowadays the world is a multi-sensor platform, and measurement integration is strictly necessary. Combining the different data sources should be done in a clever way, taking advantage of the best performance of each technique. An integrated analysis allows the interpretation of simultaneous temporal series of data coming from different sources, and tries to separate subsidence contributions. With this purpose, Exelis VIS in collaboration with Eni S.p.A. customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to capitalize on and combine all the different data collected in the surveys. In this article some significant examples are presented to show the potential of this tool in oil and gas activity: a hydrocarbon storage field where the comparison between SAR and production volumes emphasizes a correlation between the two measures in a few steps; and a hydrocarbon production field with the Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers and assestimeters measure in the same area at the same time, giving the opportunity to analyse data contextually. In the integrated analysis performed with PISAV, a mathematically rigorous study is not always possible, and a semi-quantitative approach is the only method for interpreting the results. 
As a result, in the first test case a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis has several advantages in monitoring land subsidence: it permits a first qualitative differentiation of the natural and anthropic components of subsidence, and also gives more reliability and coverage to each measurement, taking advantage of the strong points of each technique.

  18. Exploring the utility of quantitative network design in evaluating Arctic sea ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-08-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve 10-day to 5-month sea ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett ice severity index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.
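The core QND idea, ranking candidate observing networks before any data are collected, reduces in a linear Gaussian setting to comparing posterior covariances P = (B⁻¹ + Hᵀ R⁻¹ H)⁻¹ for different observation operators H. The two-variable state, the candidate "transects", and all numbers below are illustrative assumptions, not values from the study:

```python
import numpy as np

def posterior_cov(B, H, R):
    """Posterior covariance of a linear Gaussian analysis:
    P = (B^-1 + H^T R^-1 H)^-1, depending only on the prior B,
    the observation operator H, and the observation errors R."""
    return np.linalg.inv(np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H)

# Hypothetical two-variable state (say, ice thickness and snow depth).
B = np.array([[0.25, 0.05],
              [0.05, 0.04]])       # prior covariance (assumed)

H1 = np.array([[1.0, 0.0]])        # candidate 1: observes ice thickness
H2 = np.array([[1.0, 1.0]])        # candidate 2: observes ice + snow total
R = np.array([[0.01]])             # observation error variance (assumed)

P1 = posterior_cov(B, H1, R)
P2 = posterior_cov(B, H2, R)
print("candidate 1 posterior ice-thickness variance:", round(P1[0, 0], 4))
print("candidate 2 posterior ice-thickness variance:", round(P2[0, 0], 4))
```

Because P never involves the observed values themselves, hypothetical flight transects can be ranked by the target-region uncertainty they would leave behind, which is exactly how the QND tool evaluates networks before any flights take place.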

  19. A SUCCESSFUL BROADBAND SURVEY FOR GIANT Ly{alpha} NEBULAE. I. SURVEY DESIGN AND CANDIDATE SELECTION

    SciTech Connect

    Prescott, Moire K. M.; Dey, Arjun; Jannuzi, Buell T.

    2012-04-01

    Giant Ly{alpha} nebulae (or Ly{alpha} 'blobs') are likely sites of ongoing massive galaxy formation, but the rarity of these powerful sources has made it difficult to form a coherent picture of their properties, ionization mechanisms, and space density. Systematic narrowband Ly{alpha} nebula surveys are ongoing, but the small redshift range covered and the observational expense limit the comoving volume that can be probed by even the largest of these surveys and pose a significant problem when searching for such rare sources. We have developed a systematic search technique designed to find large Ly{alpha} nebulae at 2 {approx}< z {approx}< 3 within deep broadband imaging and have carried out a survey of the 9.4 deg{sup 2} NOAO Deep Wide-Field Survey Boötes field. With a total survey comoving volume of Almost-Equal-To 10{sup 8} h{sup -3}{sub 70} Mpc{sup 3}, this is the largest volume survey for Ly{alpha} nebulae ever undertaken. In this first paper in the series, we present the details of the survey design and a systematically selected sample of 79 candidates, which includes one previously discovered Ly{alpha} nebula.

  20. Methods for Evidence-Based Practice: Quantitative Synthesis of Single-Subject Designs

    ERIC Educational Resources Information Center

    Shadish, William R.; Rindskopf, David M.

    2007-01-01

    Good quantitative evidence does not require large, aggregate group designs. The authors describe ground-breaking work in managing the conceptual and practical demands in developing meta-analytic strategies for single subject designs in an effort to add to evidence-based practice. (Contains 2 figures.)

  1. Multidisciplinary aerospace design optimization: Survey of recent developments

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Haftka, Raphael T.

    1995-01-01

    The increasing complexity of engineering systems has sparked increasing interest in multidisciplinary optimization (MDO). This paper presents a survey of recent publications in the field of aerospace where interest in MDO has been particularly intense. The two main challenges of MDO are computational expense and organizational complexity. Accordingly, the survey is focused on the various ways different researchers deal with these challenges. The survey is organized by a breakdown of MDO into its conceptual components. Accordingly, the survey includes sections on Mathematical Modeling, Design-oriented Analysis, Approximation Concepts, Optimization Procedures, System Sensitivity, and Human Interface. With the authors' main expertise being in the structures area, the bulk of the references focus on the interaction of the structures discipline with other disciplines. In particular, two sections at the end focus on two such interactions that have recently been pursued with a particular vigor: Simultaneous Optimization of Structures and Aerodynamics, and Simultaneous Optimization of Structures Combined With Active Control.

  2. Survey design and extent estimates for the National Lakes Assessment

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) conducted a National Lake Assessment (NLA) in the conterminous USA in 2007 as part of a national assessment of aquatic resources using probability-based survey designs. The USEPA Office of Water led the assessment, in cooperation with...

  3. Use of multispectral data in design of forest sample surveys

    NASA Technical Reports Server (NTRS)

    Titus, S. J.; Wensel, L. C.

    1977-01-01

    The use of multispectral data in design of forest sample surveys using a computer software package, WILLIAM, is described. The system allows evaluation of a number of alternative sampling systems and, with appropriate cost data, estimates the implementation cost for each.

  4. Improved field experimental designs and quantitative evaluation of aquatic ecosystems

    SciTech Connect

    McKenzie, D.H.; Thomas, J.M.

    1984-05-01

    The paired-station concept and a log transformed analysis of variance were used as methods to evaluate zooplankton density data collected during five years at an electrical generation station on Lake Michigan. To discuss the example and the field design necessary for a valid statistical analysis, considerable background is provided on the questions of selecting (1) sampling station pairs, (2) experimentwise error rates for multi-species analyses, (3) levels of Type I and II error rates, (4) procedures for conducting the field monitoring program, and (5) a discussion of the consequences of violating statistical assumptions. Details for estimating sample sizes necessary to detect changes of a specified magnitude are included. Both statistical and biological problems with monitoring programs (as now conducted) are addressed; serial correlation of successive observations in the time series obtained was identified as one principal statistical difficulty. The procedure reduces this problem to a level where statistical methods can be used confidently. 27 references, 4 figures, 2 tables.
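The sample-size estimation the abstract mentions can be illustrated with the classic normal-approximation formula on log-transformed densities. This is a sketch under stated assumptions (two-sided α = 0.05, power 0.80, and the approximation that the log-scale standard deviation roughly equals the coefficient of variation); the CV and change values are hypothetical, not from the study:

```python
import math

def sample_size(pct_change, cv, alpha_z=1.96, power_z=0.84):
    """Observations per group needed to detect a proportional change.
    On the log scale the change becomes the additive effect
    log(1 + pct_change), and the standard deviation is taken to be
    approximately the coefficient of variation (cv)."""
    delta = math.log(1.0 + pct_change)
    return math.ceil(((alpha_z + power_z) * cv / delta) ** 2)

# Hypothetical case: detecting a 50% change in density with CV = 0.6
# requires 18 observations; smaller changes require far more.
print(sample_size(0.50, 0.6))   # 18
print(sample_size(0.25, 0.6))   # 57
```

The rapid growth of n as the detectable change shrinks is the practical reason the abstract stresses choosing Type I and II error rates and the magnitude of change before fieldwork begins.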

  5. Survey Says? A Primer on Web-based Survey Design and Distribution

    PubMed Central

    Oppenheimer, Adam J.; Pannucci, Christopher J.; Kasten, Steven J.; Haase, Steven C.

    2011-01-01

    The internet has changed the way in which we gather and interpret information. While books were once the exclusive bearers of data, knowledge is now only a keystroke away. The internet has also facilitated the synthesis of new knowledge. Specifically, it has become a tool through which medical research is conducted. A review of the literature reveals that in the past year, over one hundred medical publications have been based on web-based survey data alone. Due to emerging internet technologies, web-based surveys can now be launched with little computer knowledge. They may also be self-administered, eliminating personnel requirements. Ultimately, an investigator may build, implement, and analyze survey results with speed and efficiency, obviating the need for mass mailings and data processing. All of these qualities have rendered telephone and mail-based surveys virtually obsolete. Despite these capabilities, web-based survey techniques are not without their limitations, namely recall and response biases. When used properly, however, web-based surveys can greatly simplify the research process. This article discusses the implications of web-based surveys and provides guidelines for their effective design and distribution. PMID:21701347

  6. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
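A minimal metamodel in the spirit of the techniques surveyed: fit a polynomial response surface to a small design of experiments drawn from an "expensive" analysis, then predict cheaply elsewhere. The test function and polynomial degree are illustrative assumptions, not examples from the paper:

```python
import numpy as np

def expensive_analysis(x):
    """Stand-in for a costly simulation code."""
    return np.sin(3 * x) + 0.5 * x ** 2

# Design of experiments: a handful of evaluations of the expensive code.
x_doe = np.linspace(-1.0, 1.0, 9)
y_doe = expensive_analysis(x_doe)

# Metamodel: a degree-5 polynomial response surface fitted by least squares.
model = np.polynomial.Polynomial.fit(x_doe, y_doe, deg=5)

# The metamodel now replaces the expensive code for prediction.
x_test = np.linspace(-1.0, 1.0, 101)
err = np.max(np.abs(model(x_test) - expensive_analysis(x_test)))
print(f"max metamodel error on [-1, 1]: {err:.3f}")
```

This also illustrates the paper's warning about deterministic codes: the misfit here is pure approximation error, not random noise, so classical error statistics built on independent random errors do not directly apply.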

  7. Large Synoptic Survey Telescope: From Science Drivers to Reference Design

    SciTech Connect

    Ivezic, Z.; Axelrod, T.; Brandt, W.N.; Burke, D.L.; Claver, C.F.; Connolly, A.; Cook, K.H.; Gee, P.; Gilmore, D.K.; Jacoby, S.H.; Jones, R.L.; Kahn, S.M.; Kantor, J.P.; Krabbendam, V.; Lupton, R.H.; Monet, D.G.; Pinto, P.A.; Saha, A.; Schalk, T.L.; Schneider, D.P.; Strauss, Michael A.; /Washington U., Seattle, Astron. Dept. /LSST Corp. /Penn State U., Astron. Astrophys. /KIPAC, Menlo Park /NOAO, Tucson /LLNL, Livermore /UC, Davis /Princeton U., Astrophys. Sci. Dept. /Naval Observ., Flagstaff /Arizona U., Astron. Dept. - Steward Observ. /UC, Santa Cruz /Harvard U. /Johns Hopkins U. /Illinois U., Urbana

    2011-10-14

    In the history of astronomy, major advances in our understanding of the Universe have come from dramatic improvements in our ability to accurately measure astronomical quantities. Aided by rapid progress in information technology, current sky surveys are changing the way we view and study the Universe. Next-generation surveys will maintain this revolutionary progress. We focus here on the most ambitious survey currently planned in the visible band, the Large Synoptic Survey Telescope (LSST). LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. It will be a large, wide-field ground-based system designed to obtain multiple images covering the sky that is visible from Cerro Pachon in Northern Chile. The current baseline design, with an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg{sup 2} field of view, and a 3,200 Megapixel camera, will allow about 10,000 square degrees of sky to be covered using pairs of 15-second exposures in two photometric bands every three nights on average. The system is designed to yield high image quality, as well as superb astrometric and photometric accuracy. The survey area will include 30,000 deg{sup 2} with {delta} < +34.5{sup o}, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg{sup 2} region about 1000 times in the six bands during the anticipated 10 years of operation. These data will result in databases including 10 billion galaxies and a similar number of stars, and will serve the majority of science programs. The remaining 10% of the observing time will be allocated to special programs such as Very Deep and Very Fast time domain surveys. 
We describe how the LSST science drivers led to these choices of system parameters.

  8. A survey on methods of design features identification

    NASA Astrophysics Data System (ADS)

    Grabowik, C.; Kalinowski, K.; Paprocka, I.; Kempa, W.

    2015-11-01

    It is widely accepted that design features are one of the most attractive methods of integrating most fields of engineering activity, such as design modelling, process planning or production scheduling. One of the most important tasks realized in the process of integrating design and planning functions is design translation, meant as the mapping of design data into data relevant to process planning needs, i.e. manufacturing data. A design geometrical shape translation process can be realized with one of the following strategies: (i) designing with a previously prepared design features library, also known as the DBF (design by feature) method; (ii) interactive design features recognition (IFR); (iii) automatic design features recognition (AFR). In the DBF method, the design geometrical shape is created with design features. There are two basic approaches to design modelling in the DBF method: the classic approach, in which a part design is modelled from beginning to end with design features previously stored in a design features database, and the hybrid approach, where a part is partially created with standard predefined CAD system tools and the rest with suitable design features. Automatic feature recognition consists of an autonomous search of a product model, represented with a specific design representation method, in order to find those model features which might potentially be recognized as design features, manufacturing features, etc. This approach requires a searching algorithm to be prepared. The searching algorithm should allow the whole recognition process to be carried out without user supervision. Currently there are many AFR methods. These methods most often require the product model to be represented with a B-Rep representation, rarely CSG, and very rarely wireframe. In the IFR method, potential features are recognized by a user.
This process is most often realized by a user who points out those surfaces which seem to belong to a currently identified feature. In the IFR method, the system designer defines a set of features and sets a collection of recognition process parameters. This allows individual features to be unambiguously identified, automatically or semi-automatically, directly in the CAD system or in an external application to which the part model is transferred. Additionally, a user is able to define non-geometrical information such as overall dimensions, surface roughness, etc. In this paper a survey of methods of feature identification and recognition is presented, especially in the context of AFR methods.

  9. Materials for Digital Optical Design:. a Survey Study

    NASA Astrophysics Data System (ADS)

    Ismail, Ayman Abdel Khader; Ismail, Imane Aly Saroit; Ahmed, S. H.

    2010-04-01

    In the last few years digital optical design has received major attention in research fields. Many studies have been published in the fields of optical materials, instruments, circuit design and devices. This is considered to be a highly multidisciplinary field, and its success requires the collaborative efforts of many disciplines, ranging from device and optical engineers to computer architects, chemists, materials scientists, and optical physicists. In this study we introduce a survey of the latest papers in the field of optical materials and their properties with respect to light; this paper is organized in three major sections: optical glasses, compound materials, and nonlinear absorption (multi-photon absorption) and up-conversion.

  10. 76 FR 9637 - Proposed Information Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-18

    ...DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-New...Proposed Information Collection (Veteran Suicide Prevention Online Quantitative...Activity: Comment Request AGENCY: Veterans Health Administration, Department...

  11. Image resolution analysis: a new, robust approach to seismic survey design 

    E-print Network

    Tzimeas, Constantinos

    2005-08-29

    Seismic survey design methods often rely on qualitative measures to provide an optimal image of their objective target. Fold, ray tracing techniques counting ray hits on binned interfaces, and even advanced 3-D survey design methods that try...

  12. 23 CFR 1340.10 - Submission and approval of seat belt survey design.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...approval of seat belt survey design. (a) Contents...the State's seat belt survey design submitted for... (i) Define all sampling units, with their measures...procedures to adjust sampling weight for observation...1340.5(g). (b) Survey design submission...

  13. 23 CFR 1340.10 - Submission and approval of seat belt survey design.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...approval of seat belt survey design. (a) Contents...the State's seat belt survey design submitted for... (i) Define all sampling units, with their measures...procedures to adjust sampling weight for observation...1340.5(g). (b) Survey design submission...

  14. 23 CFR 1340.10 - Submission and approval of seat belt survey design.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...approval of seat belt survey design. (a) Contents...the State's seat belt survey design submitted for... (i) Define all sampling units, with their measures...procedures to adjust sampling weight for observation...1340.5(g). (b) Survey design submission...

  15. SEDS: THE SPITZER EXTENDED DEEP SURVEY. SURVEY DESIGN, PHOTOMETRY, AND DEEP IRAC SOURCE COUNTS

    SciTech Connect

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Hernquist, L.; Hora, J. L.; Arendt, R.; Barmby, P.; Barro, G.; Faber, S.; Guhathakurta, P.; Bouwens, R.; Cattaneo, A.; Croton, D.; Dave, R.; Dunlop, J. S.; Egami, E.; Finlator, K.; Grogin, N. A.; and others

    2013-05-20

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg{sup 2} to a depth of 26 AB mag (3{sigma}) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 {mu}m. Because of its uniform depth of coverage in so many widely-separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly-available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 {+-} 1.0 and 4.4 {+-} 0.8 nW m{sup -2} sr{sup -1} at 3.6 and 4.5 {mu}m to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  16. SEDS: The Spitzer Extended Deep Survey. Survey Design, Photometry, and Deep IRAC Source Counts

    NASA Technical Reports Server (NTRS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Arendt, R.; Barmby, P.; Barro, G.; Bell, E. F.; Bouwens, R.; Cattaneo, A.; Croton, D.; Dave, R.; Dunlop, J. S.; Egami, E.; Faber, S.; Finlator, K.; Grogin, N. A.; Guhathakurta, P.; Hernquist, L.; Hora, J. L.; Illingworth, G.; Kashlinsky, A.; Koekemoer, A. M.; Koo, D. C.; Moseley, H.

    2013-01-01

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 μm. Because of its uniform depth of coverage in so many widely-separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly-available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 μm to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  17. Design Considerations: Falcon M Dwarf Habitable Exoplanet Survey

    NASA Astrophysics Data System (ADS)

    Polsgrove, Daniel; Novotny, Steven; Della-Rose, Devin J.; Chun, Francis; Tippets, Roger; O'Shea, Patrick; Miller, Matthew

    2016-01-01

    The Falcon Telescope Network (FTN) is an assemblage of twelve automated 20-inch telescopes positioned around the globe, controlled from the Cadet Space Operations Center (CSOC) at the US Air Force Academy (USAFA) in Colorado Springs, Colorado. Five of the twelve sites are currently installed, with full operational capability expected by the end of 2016. Though optimized for studying near-earth objects to accomplish its primary mission of Space Situational Awareness (SSA), the Falcon telescopes are in many ways similar to those used by ongoing and planned exoplanet transit surveys targeting individual M dwarf stars (e.g., MEarth, APACHE, SPECULOOS). The network's worldwide geographic distribution provides additional potential advantages. We have performed analytical and empirical studies exploring the viability of employing the FTN for a future survey of nearby late-type M dwarfs tailored to detect transits of 1-2 R_Earth exoplanets in habitable-zone orbits. We present empirical results on photometric precision derived from data collected with multiple Falcon telescopes on a set of nearby (< 25 pc) M dwarfs using infrared filters and a range of exposure times, as well as sample light curves created from images gathered during known transits of varying transit depths. An investigation of survey design parameters is also described, including an analysis of site-specific weather data, anticipated telescope time allocation and the percentage of nearby M dwarfs with sufficient check stars within the Falcons' 11' x 11' field of view required to perform effective differential photometry. The results of this ongoing effort will inform the likelihood of discovering one (or more) habitable-zone exoplanets given current occurrence rate estimates over a nominal five-year campaign, and will dictate specific survey design features in preparation for initiating project execution when the FTN begins full-scale automated operations.
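    The check-star requirement above comes down to ensemble differential photometry: dividing the target's flux by the combined flux of comparison stars in the same frame so that shared atmospheric and instrumental variations cancel. The sketch below illustrates that idea only; it is not the survey's pipeline, and the flux arrays are assumed inputs.

```python
import numpy as np

def differential_light_curve(target_flux, comp_fluxes):
    """Ensemble differential photometry sketch: divide the target's
    per-frame flux by the summed flux of comparison (check) stars
    measured in the same frames, then normalize by the median ratio.
    Variations common to all stars (clouds, airmass) cancel."""
    comp = np.sum(np.asarray(comp_fluxes, dtype=float), axis=0)
    ratio = np.asarray(target_flux, dtype=float) / comp
    return ratio / np.median(ratio)

# A 1% dip in the target with steady comparison stars survives the
# normalization as a 1% dip in the relative light curve.
lc = differential_light_curve([1.0, 0.99, 1.0], [[1.0, 1.0, 1.0]])
```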

  18. New facility design and work method for the quantitative fit testing laboratory. Master's thesis

    SciTech Connect

    Ward, G.F.

    1989-05-01

    The United States Air Force School of Aerospace Medicine (USAFSAM) tests the quantitative fit of masks which are worn by military personnel during nuclear, biological, and chemical warfare. Subjects are placed in a Dynatech-Frontier Fit Testing Chamber, salt air is fed into the chamber, and samples of air are drawn from the mask and the chamber. The ratio of salt air outside the mask to salt air inside the mask is called the quantitative fit factor. A motion-time study was conducted to evaluate the efficiency of the layout and work method presently used in the laboratory. A link analysis was done to determine equipment priorities, and the link data and design guidelines were used to develop three proposed laboratory designs. The proposals were evaluated by projecting the time and motion efficiency, and the energy expended working in each design. Also evaluated were the lengths of the equipment links for each proposal, and each proposal's adherence to design guidelines. A mock-up was built of the best design proposal, and a second motion-time study was run. Results showed that with the new laboratory and work procedures, the USAFSAM analyst could test 116 more subjects per year than are currently tested. Finally, the results of a questionnaire given to the analyst indicated that user acceptance of the work area improved with the new design.
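    The quantitative fit factor described above is a simple concentration ratio. As a minimal sketch (the numbers and units are illustrative, not USAFSAM's protocol):

```python
def fit_factor(chamber_conc, mask_conc):
    """Quantitative fit factor: ratio of challenge (salt) aerosol
    concentration sampled in the chamber to the concentration
    sampled inside the mask. Higher values indicate a better seal."""
    if mask_conc <= 0:
        raise ValueError("in-mask concentration must be positive")
    return chamber_conc / mask_conc

# e.g. 50.0 concentration units in the chamber and 0.25 inside the
# mask give a fit factor of 200.
```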

  19. The Large Synoptic Survey Telescope concept design overview

    NASA Astrophysics Data System (ADS)

    Krabbendam, Victor L.

    2008-07-01

    The Large Synoptic Survey Telescope Project is a public-private partnership that has successfully completed the Concept Design of its wide-field ground based survey system and started several long-lead construction activities using private funding. The telescope has a 3-mirror wide field optical system with an 8.4 meter primary, 3.4 meter secondary, and 5 meter tertiary mirror. The reflective optics feed three refractive elements and a 64 cm 3.2 gigapixel camera. The telescope will be located on the summit of Cerro Pachón in Chile. The LSST data management system will reduce, transport, alert on, and archive the roughly 15 terabytes of data produced nightly, and will serve the raw and catalog data accumulating at an average of 7 petabytes per year to the community without any proprietary period. This survey will yield contiguous overlapping imaging of 20,000 square degrees of sky in 6 optical filter bands covering wavelengths from 320 to 1080 nm. The project continues to attract institutional partners and has acquired non-federal funding sufficient to construct the primary mirror, already in progress at the University of Arizona, and fund detector prototype efforts, two of the longest lead items in the LSST. The project has submitted a proposal for construction to the National Science Foundation Major Research Equipment and Facilities Construction (MREFC) program and is preparing for a 2011 funding authorization.

  20. Quantitative Survey and Structural Classification of Fracking Chemicals Reported in Unconventional Gas Exploitation

    NASA Astrophysics Data System (ADS)

    Elsner, Martin; Schreglmann, Kathrin

    2015-04-01

    Few technologies are being discussed in such controversial terms as hydraulic fracturing ("fracking") in the recovery of unconventional gas. Particular concern surrounds the chemicals that may return to the surface as a result of hydraulic fracturing. These are either "fracking chemicals" - chemicals that are injected together with the fracking fluid to optimize the fracturing performance - or geogenic substances which may turn up during gas production, in the so-called produced water originating from the target formation. Knowledge about them is warranted for several reasons. (1) Monitoring. Air emissions are reported to arise from well drilling, the gas itself or condensate tanks. In addition, potential spills and accidents bear the danger of surface and shallow groundwater contaminations. Monitoring strategies are therefore warranted to screen for "indicator" substances of potential impacts. (2) Chemical Analysis. To meet these analytical demands, target substances must be defined so that adequate sampling approaches and analytical methods can be developed. (3) Transformation in the Subsurface. Identification and classification of fracking chemicals (aromatics vs. alcohols vs. acids, esters, etc.) is further important to assess the possibility of subsurface reactions which may potentially generate new, as yet unidentified transformation products. (4) Wastewater Treatment. For the same reason chemical knowledge is important for optimized wastewater treatment strategies. (5) Human and Ecosystem Health. Knowledge of the most frequent fracking chemicals is further essential for risk assessment (environmental behavior, toxicity). (6) Public Discussions. Finally, an overview of reported fracking chemicals can provide unbiased scientific input into current public debates and enable critical reviews of Green Chemistry approaches. Presently, however, such information is not readily available.
We aim to close this knowledge gap by providing a quantitative overview of chemical additives reported for use in hydraulic fracturing. For the years 2005-2009 it is based on the Waxman report, and for the years 2011-2013 it relies on the database FracFocus, where it makes use of the data extracted and provided by the website "SkyTruth". For the first time, we list fracking chemicals according to their chemical structure and functional groups, because these properties provide a starting point for (i) designing analytical methods, (ii) assessing environmental fate, and (iii) understanding why a given chemical is used at a certain stage of the fracturing process and what possible alternatives exist.
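    The structural classification described above (aromatics vs. alcohols vs. acids, esters, etc.) can be caricatured as a lookup from additive name to functional-group class. The toy rules below are assumptions for illustration only; a real classification, as the authors note, starts from chemical structure, not names.

```python
def classify_additive(name):
    """Toy name-pattern classifier for the structural classes named
    in the abstract. Real classification would work from structures
    (e.g. SMILES strings), not name suffixes; '-ate' in particular
    covers both esters and inorganic salts, so it is labeled crudely."""
    n = name.lower()
    if "acid" in n:
        return "acid"
    if n.endswith("ate"):
        return "ester/salt"
    if n.endswith("ol"):
        return "alcohol"
    if any(a in n for a in ("benzene", "toluene", "xylene", "naphthalene")):
        return "aromatic"
    return "other"

# e.g. classify_additive("methanol") -> "alcohol"
```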

  1. The Large Synoptic Survey Telescope preliminary design overview

    NASA Astrophysics Data System (ADS)

    Krabbendam, V. L.; Sweeney, D.

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) Project is a public-private partnership that is well into the design and development of the complete observatory system to conduct a wide fast deep survey and to process and serve the data. The telescope has a 3-mirror wide field optical system with an 8.4 meter primary, 3.4 meter secondary, and 5 meter tertiary mirror. The reflective optics feed three refractive elements and a 64 cm 3.2 gigapixel camera. The LSST data management system will reduce, transport, alert and archive the roughly 15 terabytes of data produced nightly, and will serve the raw and catalog data accumulating at an average of 7 petabytes per year to the community without any proprietary period. The project has completed several data challenges designed to prototype and test the data management system to significant pre-construction levels. The project continues to attract institutional partners and has acquired non-federal funding sufficient to construct the primary mirror, already in progress at the University of Arizona, build the secondary mirror substrate, completed by Corning, and fund detector prototype efforts, several of which have been tested on the sky. A focus of the project is systems engineering, risk reduction through prototyping and major efforts in image simulation and operation simulations. The project has submitted a proposal for construction to the National Science Foundation Major Research Equipment and Facilities Construction (MREFC) program and has prepared project advocacy papers for the National Research Council's Astronomy 2010 Decadal Survey. The project is preparing for a 2012 construction funding authorization.

  2. Designing Questionnaire Items: Lessons Learned from Faculty and Student Surveys.

    ERIC Educational Resources Information Center

    Meld, Andrea

    Surveys used for program and institutional evaluation, such as self-studies conducted for accreditation review, are discussed. Frequently, these evaluations take the form of faculty surveys and student surveys. This paper explores the following general considerations associated with mail surveys and other surveys: avoidance of response bias;…

  3. Ten Years of LibQual: A Study of Qualitative and Quantitative Survey Results at the University of Mississippi 2001-2010

    ERIC Educational Resources Information Center

    Greenwood, Judy T.; Watson, Alex P.; Dennis, Melissa

    2011-01-01

    This article analyzes quantitative adequacy gap scores and coded qualitative comments from LibQual surveys at the University of Mississippi from 2001 to 2010, looking for relationships between library policy changes and LibQual results and any other trends that emerged. Analysis found no relationship between changes in policy and survey results…

  4. 23 CFR 1340.10 - Submission and approval of seat belt survey design.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Submission and approval of seat belt survey design. 1340... TRANSPORTATION UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements § 1340.10 Submission and approval of seat belt survey design. (a) Contents: The following...

  5. 23 CFR 1340.10 - Submission and approval of seat belt survey design.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false Submission and approval of seat belt survey design. 1340... TRANSPORTATION UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements § 1340.10 Submission and approval of seat belt survey design. (a) Contents: The following...

  6. 23 CFR 1340.10 - Submission and approval of seat belt survey design.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 23 Highways 1 2013-04-01 2013-04-01 false Submission and approval of seat belt survey design. 1340... TRANSPORTATION UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements § 1340.10 Submission and approval of seat belt survey design. (a) Contents: The following...

  7. A robust rotorcraft flight control system design methodology utilizing quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Gorder, Peter James

    1993-01-01

    Rotorcraft flight control systems present design challenges which often exceed those associated with fixed-wing aircraft. First, large variations in the response characteristics of the rotorcraft result from the wide range of airspeeds of typical operation (hover to over 100 kts). Second, the assumption of vehicle rigidity often employed in the design of fixed-wing flight control systems is rarely justified in rotorcraft, where rotor degrees of freedom can have a significant impact on the system performance and stability. This research was intended to develop a methodology for the design of robust rotorcraft flight control systems. Quantitative Feedback Theory (QFT) was chosen as the basis for the investigation. Quantitative Feedback Theory is a technique which accounts for variability in the dynamic response of the controlled element in the design of robust control systems. It was developed to address a Multiple-Input Single-Output (MISO) design problem, and utilizes two degrees of freedom to satisfy the design criteria. Two techniques were examined for extending the QFT MISO technique to the design of a Multiple-Input-Multiple-Output (MIMO) flight control system (FCS) for a UH-60 Black Hawk Helicopter. In the first, a set of MISO systems, mathematically equivalent to the MIMO system, was determined. QFT was applied to each member of the set simultaneously. In the second, the same set of equivalent MISO systems was analyzed sequentially, with closed loop response information from each loop utilized in subsequent MISO designs. The results of each technique were compared, and the advantages of the second, termed Sequential Loop Closure, were clearly evident.
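    The core QFT idea the abstract relies on - quantifying plant variability before shaping the feedback loop - can be sketched by computing the frequency-response spread (a QFT "template") of an uncertain plant. The first-order plant and parameter ranges below are illustrative assumptions, not the UH-60 model.

```python
import numpy as np

def template(omega, gains, poles):
    """Frequency responses of P(s) = k / (s + a) at a single
    frequency omega, evaluated over a grid of uncertain parameter
    values. The resulting point set is a QFT plant template."""
    s = 1j * omega
    return np.array([k / (s + a) for k in gains for a in poles])

# Magnitude spread of the template in dB at omega = 1 rad/s; the
# larger the spread, the more feedback gain is needed to hold the
# closed-loop response within tolerances.
t = template(1.0, gains=[1.0, 2.0], poles=[0.5, 2.0])
spread_db = 20 * np.log10(np.abs(t).max() / np.abs(t).min())
```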

  8. Antibody Drug Conjugates: Application of Quantitative Pharmacology in Modality Design and Target Selection.

    PubMed

    Sadekar, S; Figueroa, I; Tabrizi, M

    2015-07-01

    Antibody drug conjugates (ADCs) are a multi-component modality comprising an antibody targeting a cell-specific antigen, a potent drug/payload, and a linker that can be processed within cellular compartments to release payload upon internalization. Numerous ADCs are being evaluated in both research and clinical settings within the academic and pharmaceutical industry due to their ability to selectively deliver potent payloads. Hence, there is a clear need to incorporate quantitative approaches during early stages of drug development for effective modality design and target selection. In this review, we describe a quantitative approach and framework for evaluation of the interplay between drug- and systems-dependent properties (i.e., target expression, density, localization, turnover, and affinity) in order to deliver a sufficient amount of a potent payload into the relevant target cells. As discussed, theoretical approaches with particular consideration given to various key properties of the target and modality suggest that delivery of the payload into particular effect cells is more sensitive to antigen concentration for targets with slow turnover rates as compared to those with faster internalization rates. Further assessments also suggest that increasing doses beyond the threshold of the target capacity (a function of target internalization and expression) may not impact the maximum amount of payload delivered to the intended effect cells. This article will explore the important application of quantitative sciences in selection of the target and design of ADC modalities. PMID:25933599
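    The saturation argument above - dosing beyond target capacity adds little delivered payload - can be sketched with a simple occupancy model. The function and all parameter values are illustrative assumptions, not the authors' model.

```python
def delivery_rate(antigens_per_cell, k_int, adc_conc, kd):
    """Per-cell payload delivery rate sketched as (antigen number)
    x (internalization rate constant) x (fractional ADC occupancy).
    The ceiling antigens_per_cell * k_int plays the role of the
    'target capacity': no dose can push delivery past it."""
    occupancy = adc_conc / (adc_conc + kd)  # simple binding isotherm
    return antigens_per_cell * k_int * occupancy

# A 100-fold dose increase well above Kd barely changes delivery,
# because occupancy is already near 1.
low = delivery_rate(1e5, 0.1, adc_conc=10.0, kd=1.0)
high = delivery_rate(1e5, 0.1, adc_conc=1000.0, kd=1.0)
```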

  9. Rotorcraft flight control design using quantitative feedback theory and dynamic crossfeeds

    NASA Technical Reports Server (NTRS)

    Cheng, Rendy P.

    1995-01-01

    A multi-input, multi-output controls design with robust crossfeeds is presented for a rotorcraft in near-hovering flight using quantitative feedback theory (QFT). Decoupling criteria are developed for dynamic crossfeed design and implementation. Frequency dependent performance metrics focusing on piloted flight are developed and tested on 23 flight configurations. The metrics show that the resulting design is superior to alternative control system designs using conventional fixed-gain crossfeeds and to feedback-only designs which rely on high gains to suppress undesired off-axis responses. The use of dynamic, robust crossfeeds prior to the QFT design reduces the magnitude of required feedback gain and results in performance that meets current handling qualities specifications relative to the decoupling of off-axis responses. The combined effect of the QFT feedback design following the implementation of low-order, dynamic crossfeed compensator successfully decouples ten of twelve off-axis channels. For the other two channels it was not possible to find a single, low-order crossfeed that was effective.

  10. Surveys of Need for Office Design and Other Design Skills Among Former Interior Design Graduates and Employers of Designers. Volume IX, No. 15.

    ERIC Educational Resources Information Center

    Daly, Pat; Lucas, John A.

    In order to measure the demand for an office design course among interior design graduates of Harper College and to determine the design skill needs of area employers, two studies were undertaken. An initial mail survey, a second mailing, and a series of telephone calls yielded a 75% response rate from the program's 88 graduates. Of the…

  11. Curriculum Design of Computer Graphics Programs: A Survey of Art/Design Programs at the University Level.

    ERIC Educational Resources Information Center

    McKee, Richard Lee

    This master's thesis reports the results of a survey submitted to over 30 colleges and universities that currently offer computer graphics courses or are in the planning stage of curriculum design. Intended to provide a profile of the computer graphics programs and insight into the process of curriculum design, the survey gathered data on program…

  12. The SCUBA Half-Degree Extragalactic Survey - I. Survey motivation, design and data processing

    NASA Astrophysics Data System (ADS)

    Mortier, A. M. J.; Serjeant, S.; Dunlop, J. S.; Scott, S. E.; Ade, P.; Alexander, D.; Almaini, O.; Aretxaga, I.; Baugh, C.; Benson, A. J.; Best, P. N.; Blain, A.; Bock, J.; Borys, C.; Bressan, A.; Carilli, C.; Chapin, E. L.; Chapman, S.; Clements, D. L.; Coppin, K.; Crawford, M.; Devlin, M.; Dicker, S.; Dunne, L.; Eales, S. A.; Edge, A. C.; Farrah, D.; Fox, M.; Frenk, C.; Gaztañaga, E.; Gear, W. K.; Gonzales-Solares, E.; Granato, G. L.; Greve, T. R.; Grimes, J. A.; Gundersen, J.; Halpern, M.; Hargrave, P.; Hughes, D. H.; Ivison, R. J.; Jarvis, M. J.; Jenness, T.; Jimenez, R.; van Kampen, E.; King, A.; Lacey, C.; Lawrence, A.; Lepage, K.; Mann, R. G.; Marsden, G.; Mauskopf, P.; Netterfield, B.; Oliver, S.; Olmi, L.; Page, M. J.; Peacock, J. A.; Pearson, C. P.; Percival, W. J.; Pope, A.; Priddey, R. S.; Rawlings, S.; Roche, N.; Rowan-Robinson, M.; Scott, D.; Sekiguchi, K.; Seigar, M.; Silva, L.; Simpson, C.; Smail, I.; Stevens, J. A.; Takagi, T.; Tucker, G.; Vlahakis, C.; Waddington, I.; Wagg, J.; Watson, M.; Willott, C.; Vaccari, M.

    2005-10-01

    The Submillimetre Common-User Bolometer Array (SCUBA) Half-Degree Extragalactic Survey (SHADES) is a major new blank-field extragalactic submillimetre (submm) survey currently underway at the James Clerk Maxwell Telescope (JCMT). Ultimately, SHADES aims to cover half a square degree at 450 and 850 μm to a 4σ depth of ≈8 mJy at 850 μm. Two fields are being observed, the Subaru/XMM-Newton Deep Field (SXDF) (02h18m −05°) and the Lockman Hole East (10h52m +57°). The survey has three main aims: (i) to investigate the population of high-redshift submm galaxies and the cosmic history of massive dust-enshrouded star formation activity; (ii) to investigate the clustering properties of submm-selected galaxies in order to determine whether these objects could be progenitors of present-day massive ellipticals; and (iii) to investigate the fraction of submm-selected sources that harbour active galactic nuclei. To achieve these aims requires that the submm data be combined with co-spatial information spanning the radio-to-X-ray frequency range. Accordingly, SHADES has been designed to benefit from ultra-deep radio imaging obtained with the Very Large Array (VLA), deep mid-infrared observations from the Spitzer Space Telescope, submm mapping by the Balloon-borne Large Aperture Submillimetre Telescope (BLAST), deep near-infrared imaging with the United Kingdom Infrared Telescope, deep optical imaging with the Subaru Telescope and deep X-ray observations with the XMM-Newton observatory. It is expected that the resulting extensive multiwavelength data set will provide complete photometric redshift information, as well as detailed spectral energy distributions, for the vast majority of the submm-selected sources.
In this paper, the first of a series on SHADES, we present an overview of the motivation for the survey, describe the SHADES survey strategy, provide a detailed description of the primary data-analysis pipeline and demonstrate the superiority of our adopted matched-filter source-extraction technique over, for example, Emerson-II style methods. We also report on the progress of the survey. As of 2004 February, 720 arcmin² had been mapped with SCUBA (about 40 per cent of the anticipated final total area) to a median 1σ depth of 2.2 mJy per beam at 850 μm (25 mJy per beam at 450 μm), and the source-extraction routines give a source density of 650 ± 50 sources deg⁻² above 3σ at 850 μm. Although uncorrected for Eddington bias, this source density is sufficient to address the science goals of SHADES once half a square degree is observed. A refined reanalysis of the original 8-mJy survey Lockman Hole data was carried out in order to evaluate the new data-reduction pipeline. Of the 17 most secure sources in the original sample, 12 have been reconfirmed, including 10 of the 11 for which radio identifications were previously secured.

  13. 23 CFR 1340.11 - Post-approval alterations to survey design.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements § 1340.11..., sample design, seat belt use rate estimation method, variance estimation method and data...

  14. 23 CFR 1340.11 - Post-approval alterations to survey design.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements § 1340.11..., sample design, seat belt use rate estimation method, variance estimation method and data...

  15. 23 CFR 1340.11 - Post-approval alterations to survey design.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... UNIFORM CRITERIA FOR STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Administrative Requirements § 1340.11..., sample design, seat belt use rate estimation method, variance estimation method and data...

  16. Perspectives of Speech-Language Pathologists on the Use of Telepractice in Schools: Quantitative Survey Results

    PubMed Central

    Tucker, Janice K.

    2012-01-01

    This research surveyed 170 school-based speech-language pathologists (SLPs) in one northeastern state, with only 1.8% reporting telepractice use in school settings. These results were consistent with two ASHA surveys (2002, 2011) that reported limited use of telepractice for school-based speech-language pathology. In the present study, willingness to use telepractice was inversely related to age, perhaps because younger members of the profession are more accustomed to using technology. Overall, respondents were concerned about the validity of assessments administered via telepractice; whether clinicians can adequately establish rapport with clients via telepractice; and if therapy conducted via telepractice can be as effective as in-person speech-language therapy. Most respondents indicated the need to establish procedures and guidelines for school-based telepractice programs. PMID:25945204

  17. Quantitative assessment of climate change and human activities impact on the designed annual runoff

    NASA Astrophysics Data System (ADS)

    Hu, Yiming; Liang, Zhongmin; Liu, Yongwei

    2015-04-01

    In recent years, more and more researchers have studied the impact of climate change and human activities on runoff in the flood context. In this study, we propose a novel statistical method to quantitatively analyze the contributions of climate change and human activities to runoff change. The method is based on the assumption that if a given x-year designed precipitation is input to the hydrological model, the return period of the corresponding output runoff is also x-year. This assumption is widely used in hydrology when precipitation data are used to estimate the designed flood for a given design horizon. Compared to most current studies, which use a hydrological model to simulate the change, the proposed method needs less data, which makes it easy to implement. The method is employed to analyze the impact of climate change and human activities on the designed annual runoff for different design horizons in the upper basin of Tangnaihai station. The M-K test result shows that the annual runoff series has a decreasing trend. The quantitative impact assessment results show that, for the 1000-, 100-, and 50-year return periods, the designed annual runoff after 1989 was reduced by 24.8%, 24.6% and 24.2% respectively compared to that before 1989. Climate change accounts for 71.1%, 65.7% and 63.2% of the decrease in the 1000-, 100-, and 50-year designed annual runoff respectively, while human activities account for 28.9%, 34.3% and 36.8% respectively. Overall, the impact of climate change on annual runoff is higher than that of human activities. Keywords: annual runoff; climate change; human activities; impact assessment
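    The percentage split reported above can be sketched with a common attribution decomposition: reconstruct the later-period runoff from climate inputs alone, attribute that difference to climate, and attribute the residual to human activities. This is an illustrative sketch of the decomposition idea, not the paper's exact statistical procedure, and the numbers are invented.

```python
def attribute_runoff_change(obs_before, obs_after, sim_after):
    """Split an observed runoff decrease into climate and human
    shares. obs_before/obs_after are observed runoff in the two
    periods; sim_after is runoff reconstructed for the later period
    from climate inputs alone (no human change). The climate share
    is the simulated change; the residual is the human share."""
    total = obs_before - obs_after
    climate = obs_before - sim_after
    human = total - climate
    return climate / total, human / total

# e.g. runoff 100 before, 80 observed after, 86 simulated after
# (climate only) -> climate share 0.7, human share 0.3
```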

  18. Measuring access to medicines: a review of quantitative methods used in household surveys

    PubMed Central

    2010-01-01

    Background Medicine access is an important goal of medicine policy; however, the evaluation of medicine access is a subject under conceptual and methodological development. The aim of this study was to describe quantitative methodologies for measuring medicine access at the household level, with access expressed as paid or unpaid medicine acquisition. Methods Searches were carried out in electronic databases and health institutional sites, within references from retrieved papers, and by contacting authors. Results Nine papers were located. The methodologies of the studies differed in recall period, recruitment of subjects and characterization of medicine access. Conclusions Standardization of medicine access indicators and definition of appropriate recall periods are required to evaluate different medicines and access dimensions, improving comparability between studies. In addition, specific keywords must be established to allow future literature reviews on this topic. PMID:20509960

  19. Quantitatively structural control of the karst based on speleological cave survey data: Cabeza Llerosos massif (Picos de Europa, Spain)

    NASA Astrophysics Data System (ADS)

    Ballesteros, D.; Jiménez-Sánchez, M.; García-Sansegundo, J.; Borreguero, M.; Sendra, G.

    2012-04-01

    Speleological cave survey characterizes each cave passage by a 3D line (called a survey shot) defined by its length, direction and dip. This line represents the three-dimensional geometry of the karst system at cave-passage scale and can be statistically analyzed and compared with the geometry of the massif discontinuities. The aim of this work is to establish the quantitative influence of structural geology on caves, based on the comparison of cave survey data with joint and bedding measurements using stereographic projection. 15 km of cave surveys from the Cabeza Llerosos massif (Picos de Europa, Northern Spain) were chosen to illustrate the method. The length of the cavities ranges from 50 to 4,438 m and their depth is up to 738 m. The methodology of work includes: 1) cave survey collection from caving reports; 2) geological mapping and cross-sections with projection of the cavities; 3) data collection of bedding and joints in caves and nearby outcrops; 4) definition of families of joints and bedding planes by stereographic projection; 5) definition of groups of cave passages from stereographic projection (based on their directions and dips) and 6) comparison between bedding, families of joints and cave survey data by stereographic projection. Seven families of joints have been defined across the study area. The joint families are: J1) sub-vertical, J2) N63E/68SE, J3) N29E/46NW, J4) N52E/72NW, J5) N129E/17NE, J6) N167E/57NE and J7) N180E/26E; the bedding is N30-55E/60-80NE. Five groups of cave passages have been defined. The "A" group of cave passages is formed by sub-vertical series; it represents 61% of all the cave passages and is conditioned by the joint families J1, J3, J4 and J6, as well as their intersections. The "B" group is formed by N10W-N10E/3-20N galleries; it corresponds to 13% of the series and is controlled by the intersection between families J5 and J6.
"C" group is defined by N20-70E/0-50NE passages; it is represented by the 13 % of the cavities and is ruled by the intersection between families J1, J2, J5 and J7. "D" group is formed by N125-145E horizontal galleries; it includes the 6 % of the passages and follows the bedding. "E" group is defined by N105-151W/38-65SW passages; it is represented by the 3 % of the passages and is conditioned by the families J1, J2, J3 and J4. This work proposes a new methodology of work in speleogenesis based on the establishment of the quantitative relationships between cave survey with joints and bedding. The method shows some advantages when compared with other methodologies (e.g. statistically models based on structural analysis of the discontinuities of the rock massif or based on the Inception Horizon concept): it is a 3D analysis that can be applied on complex geological settings to make accurate estimations of the percentage of the caves controlled by each joint family and bedding.
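Step 6 of the workflow, matching survey shots against joint families by orientation, can be sketched numerically. The sketch below is illustrative only: the orientations, the grouping tolerance, and the vector convention are assumptions, not values from the paper.

```python
import numpy as np

def orient_to_vector(trend_deg, plunge_deg):
    """Convert a trend/plunge orientation (degrees) to a 3D unit vector
    (x = east, y = north, z = up; plunge measured downward from horizontal)."""
    t, p = np.radians(trend_deg), np.radians(plunge_deg)
    return np.array([np.sin(t) * np.cos(p), np.cos(t) * np.cos(p), -np.sin(p)])

def angular_separation(v1, v2):
    """Acute angle (degrees) between two orientation lines, sign-insensitive."""
    c = abs(np.clip(np.dot(v1, v2), -1.0, 1.0))
    return np.degrees(np.arccos(c))

# Hypothetical survey shot and joint-family lineation
shot = orient_to_vector(20, 5)     # roughly N20E passage, dipping 5 degrees
family = orient_to_vector(25, 8)
TOLERANCE_DEG = 15.0               # assumed grouping threshold
print(angular_separation(shot, family) < TOLERANCE_DEG)  # True
```

In practice this comparison is done on the stereonet rather than vector by vector, but the underlying test, whether a passage direction falls close enough to a joint family or intersection lineation, is the same.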

  20. The Health Effects of Climate Change: A Survey of Recent Quantitative Research

    PubMed Central

    Grasso, Margherita; Manera, Matteo; Chiabai, Aline; Markandya, Anil

    2012-01-01

    In recent years there has been a large scientific and public debate on climate change and its direct as well as indirect effects on human health. In particular, a large amount of research on the effects of climate changes on human health has addressed two fundamental questions. First, can historical data be of some help in revealing how short-run or long-run climate variations affect the occurrence of infectious diseases? Second, is it possible to build more accurate quantitative models which are capable of predicting the future effects of different climate conditions on the transmissibility of particularly dangerous infectious diseases? The primary goal of this paper is to review the most relevant contributions which have directly tackled those questions, both with respect to the effects of climate changes on the diffusion of non-infectious and infectious diseases, with malaria as a case study. Specific attention will be drawn on the methodological aspects of each study, which will be classified according to the type of quantitative model considered, namely time series models, panel data and spatial models, and non-statistical approaches. Since many different disciplines and approaches are involved, a broader view is necessary in order to provide a better understanding of the interactions between climate and health. In this respect, our paper also presents a critical summary of the recent literature related to more general aspects of the impacts of climate changes on human health, such as: the economics of climate change; how to manage the health effects of climate change; the establishment of Early Warning Systems for infectious diseases. PMID:22754455

  1. The health effects of climate change: a survey of recent quantitative research.

    PubMed

    Grasso, Margherita; Manera, Matteo; Chiabai, Aline; Markandya, Anil

    2012-05-01

    In recent years there has been a large scientific and public debate on climate change and its direct as well as indirect effects on human health. In particular, a large amount of research on the effects of climate changes on human health has addressed two fundamental questions. First, can historical data be of some help in revealing how short-run or long-run climate variations affect the occurrence of infectious diseases? Second, is it possible to build more accurate quantitative models which are capable of predicting the future effects of different climate conditions on the transmissibility of particularly dangerous infectious diseases? The primary goal of this paper is to review the most relevant contributions which have directly tackled those questions, both with respect to the effects of climate changes on the diffusion of non-infectious and infectious diseases, with malaria as a case study. Specific attention will be drawn on the methodological aspects of each study, which will be classified according to the type of quantitative model considered, namely time series models, panel data and spatial models, and non-statistical approaches. Since many different disciplines and approaches are involved, a broader view is necessary in order to provide a better understanding of the interactions between climate and health. In this respect, our paper also presents a critical summary of the recent literature related to more general aspects of the impacts of climate changes on human health, such as: the economics of climate change; how to manage the health effects of climate change; the establishment of Early Warning Systems for infectious diseases. PMID:22754455

  2. Trajectory Design for the Transiting Exoplanet Survey Satellite

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald J.; Parker, Joel J. K.; Williams, Trevor W.; Mendelsohn, Chad R.

    2014-01-01

    The Transiting Exoplanet Survey Satellite (TESS) is a National Aeronautics and Space Administration (NASA) mission, scheduled to be launched in 2017. TESS will travel in a highly eccentric orbit around Earth, with initial perigee radius near 17 Earth radii (Re) and apogee radius near 59 Re. The orbit period is near 2:1 resonance with the Moon, with apogee nearly 90 degrees out-of-phase with the Moon, in a configuration that has been shown to be operationally stable. TESS will execute phasing loops followed by a lunar flyby, with a final maneuver to achieve 2:1 resonance with the Moon. The goals of a resonant orbit with long-term stability, short eclipses and limited oscillations of perigee present significant challenges to the trajectory design. To rapidly assess launch opportunities, we adapted the Schematics Window Methodology (SWM76) launch window analysis tool to assess the TESS mission constraints. To understand the long-term dynamics of such a resonant orbit in the Earth-Moon system we employed Dynamical Systems Theory in the Circular Restricted 3-Body Problem (CR3BP). For precise trajectory analysis we use a high-fidelity model and multiple shooting in the General Mission Analysis Tool (GMAT) to optimize the maneuver delta-V and meet mission constraints. Finally we describe how the techniques we have developed can be applied to missions with similar requirements. Keywords: resonant orbit, stability, lunar flyby, phasing loops, trajectory optimization
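The quoted orbit is internally consistent and easy to check: a 2:1 lunar resonance implies an orbital period of half a sidereal lunar month, and by Kepler's third law the corresponding semi-major axis should sit midway between the stated perigee (17 Re) and apogee (59 Re), i.e. near 38 Re. A minimal sketch (the constants are standard reference values, not mission data):

```python
import math

MU_EARTH = 398600.4418                       # km^3/s^2, Earth gravitational parameter
R_EARTH = 6378.137                           # km, Earth equatorial radius
LUNAR_SIDEREAL_MONTH = 27.321661 * 86400.0   # s

# A 2:1 resonance means two spacecraft revolutions per lunar revolution.
period = LUNAR_SIDEREAL_MONTH / 2.0

# Kepler's third law: a = (mu * (P / (2*pi))^2)^(1/3)
a = (MU_EARTH * (period / (2 * math.pi)) ** 2) ** (1.0 / 3.0)

print(f"period = {period / 86400:.2f} days, a = {a / R_EARTH:.1f} Re")
```

The result, roughly 13.7 days and about 38 Earth radii, matches (17 + 59) / 2 = 38 Re from the perigee and apogee figures above.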

  3. Trajectory Design for the Transiting Exoplanet Survey Satellite (TESS)

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald J.; Parker, Joel; Williams, Trevor; Mendelsohn, Chad

    2014-01-01

    The Transiting Exoplanet Survey Satellite (TESS) is a National Aeronautics and Space Administration (NASA) mission launching in 2017. TESS will travel in a highly eccentric orbit around Earth, with initial perigee radius near 17 Earth radii (Re) and apogee radius near 59 Re. The orbit period is near 2:1 resonance with the Moon, with apogee nearly 90 degrees out-of-phase with the Moon, in a configuration that has been shown to be operationally stable. TESS will execute phasing loops followed by a lunar flyby, with a final maneuver to achieve 2:1 resonance with the Moon. The goals of a resonant orbit with long-term stability, short eclipses and limited oscillations of perigee present significant challenges to the trajectory design. To rapidly assess launch opportunities, we adapted the SWM76 launch window tool to assess the TESS mission constraints. To understand the long-term dynamics of such a resonant orbit in the Earth-Moon system we employed Dynamical Systems Theory in the Circular Restricted 3-Body Problem (CR3BP). For precise trajectory analysis we use a high-fidelity model and multiple shooting in the General Mission Analysis Tool (GMAT) to optimize the maneuver delta-V and meet mission constraints. Finally we describe how the techniques we have developed can be applied to missions with similar requirements.

  4. National Aquatic Resource Surveys: Integration of Geospatial Data in Their Survey Design and Analysis

    EPA Science Inventory

    The National Aquatic Resource Surveys (NARS) are a series of four statistical surveys conducted by the U.S. Environmental Protection Agency working in collaboration with states, tribal nations and other federal agencies. The surveys are conducted for lakes and reservoirs, streams...

  5. Professional values and reported behaviours of doctors in the USA and UK: quantitative survey

    PubMed Central

    Rao, Sowmya R; Sibbald, Bonnie; Hann, Mark; Harrison, Stephen; Walter, Alex; Guthrie, Bruce; Desroches, Catherine; Ferris, Timothy G; Campbell, Eric G

    2011-01-01

Background The authors aimed to determine US and UK doctors' professional values and reported behaviours, and the extent to which these vary with the context of care. Method 1891 US and 1078 UK doctors completed the survey (64.4% and 40.3% response rate respectively). Multivariate logistic regression was used to compare responses to identical questions in the two surveys. Results UK doctors were more likely to have developed practice guidelines (82.8% UK vs 49.6% US, p<0.001) and to have taken part in a formal medical error-reduction programme (70.9% UK vs 55.7% US, p<0.001). US doctors were more likely to agree about the need for periodic recertification (completely agree 23.4% UK vs 53.9% US, p<0.001). Nearly a fifth of doctors had direct experience of an impaired or incompetent colleague in the previous 3 years. Where the doctor had not reported the colleague to relevant authorities, reasons included thinking that someone else was taking care of the problem, believing that nothing would happen as a result, or fear of retribution. UK doctors were more likely than US doctors to agree that significant medical errors should always be disclosed to patients. More US doctors reported that they had not disclosed an error to a patient because they were afraid of being sued. Discussion The context of care may influence both how professional values are expressed and the extent to which behaviours are in line with stated values. Doctors have an important responsibility to develop their healthcare systems in ways which will support good professional behaviour. PMID:21383386

  6. Professional values and reported behaviours of doctors in the USA and UK: quantitative survey.

    PubMed

    Roland, Martin; Rao, Sowmya R; Sibbald, Bonnie; Hann, Mark; Harrison, Stephen; Walter, Alex; Guthrie, Bruce; Desroches, Catherine; Ferris, Timothy G; Campbell, Eric G

    2011-06-01

    BACKGROUND The authors aimed to determine US and UK doctors' professional values and reported behaviours, and the extent to which these vary with the context of care. METHOD 1891 US and 1078 UK doctors completed the survey (64.4% and 40.3% response rate respectively). Multivariate logistic regression was used to compare responses to identical questions in the two surveys. RESULTS UK doctors were more likely to have developed practice guidelines (82.8% UK vs 49.6% US, p<0.001) and to have taken part in a formal medical error-reduction programme (70.9% UK vs 55.7% US, p<0.001). US doctors were more likely to agree about the need for periodic recertification (completely agree 23.4% UK vs 53.9% US, p<0.001). Nearly a fifth of doctors had direct experience of an impaired or incompetent colleague in the previous 3 years. Where the doctor had not reported the colleague to relevant authorities, reasons included thinking that someone else was taking care of the problem, believing that nothing would happen as a result, or fear of retribution. UK doctors were more likely than US doctors to agree that significant medical errors should always be disclosed to patients. More US doctors reported that they had not disclosed an error to a patient because they were afraid of being sued. DISCUSSION The context of care may influence both how professional values are expressed and the extent to which behaviours are in line with stated values. Doctors have an important responsibility to develop their healthcare systems in ways which will support good professional behaviour. PMID:21383386

  7. Quantitative imaging of the human upper airway: instrument design and clinical studies

    NASA Astrophysics Data System (ADS)

    Leigh, M. S.; Armstrong, J. J.; Paduch, A.; Sampson, D. D.; Walsh, J. H.; Hillman, D. R.; Eastwood, P. R.

    2006-08-01

    Imaging of the human upper airway is widely used in medicine, in both clinical practice and research. Common imaging modalities include video endoscopy, X-ray CT, and MRI. However, no current modality is both quantitative and safe to use for extended periods of time. Such a capability would be particularly valuable for sleep research, which is inherently reliant on long observation sessions. We have developed an instrument capable of quantitative imaging of the human upper airway, based on endoscopic optical coherence tomography. There are no dose limits for optical techniques, and the minimally invasive imaging probe is safe for use in overnight studies. We report on the design of the instrument and its use in preliminary clinical studies, and we present results from a range of initial experiments. The experiments show that the instrument is capable of imaging during sleep, and that it can record dynamic changes in airway size and shape. This information is useful for research into sleep disorders, and potentially for clinical diagnosis and therapies.

  8. Career Plans Survey Report PennDesign Class of 2010

    E-print Network

    Plotkin, Joshua B.

* Sunlay Design, Architecture, Associate Architect, Beijing, China
    * TEN Arquitectos, Design, Junior, Beijing, China
    * McGillin Architecture, Inc., Architecture, Intern Architect, Bala Cynwyd, PA

    Part-time Employment:
    * Moore College of Art & Design, Graduate Program Interior Design, Adjunct Professor

  9. Controls design with crossfeeds for hovering rotorcraft using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Biezad, Daniel J.; Cheng, Rendy

    1996-01-01

    A multi-input, multi-output controls design with dynamic crossfeed pre-compensation is presented for rotorcraft in near-hovering flight using Quantitative Feedback Theory (QFT). The resulting closed-loop control system bandwidth allows the rotorcraft to be considered for use as an inflight simulator. The use of dynamic, robust crossfeeds prior to the QFT design reduces the magnitude of required feedback gain and results in performance that meets most handling qualities specifications relative to the decoupling of off-axis responses. Handling qualities are Level 1 for both low-gain tasks and high-gain tasks in the roll, pitch, and yaw axes except for the 10 deg/sec moderate-amplitude yaw command where the rotorcraft exhibits Level 2 handling qualities in the yaw axis caused by phase lag. The combined effect of the QFT feedback design following the implementation of low-order, dynamic crossfeed compensators successfully decouples ten of twelve off-axis channels. For the other two channels it was not possible to find a single, low-order crossfeed that was effective. This is an area to be investigated in future research.

  10. Hydrological drought types in cold climates: quantitative analysis of causing factors and qualitative survey of impacts

    NASA Astrophysics Data System (ADS)

    Van Loon, A. F.; Ploum, S. W.; Parajka, J.; Fleig, A. K.; Garnier, E.; Laaha, G.; Van Lanen, H. A. J.

    2015-04-01

    For drought management and prediction, knowledge of causing factors and socio-economic impacts of hydrological droughts is crucial. Propagation of meteorological conditions in the hydrological cycle results in different hydrological drought types that require separate analysis. In addition to the existing hydrological drought typology, we here define two new drought types related to snow and ice. A snowmelt drought is a deficiency in the snowmelt discharge peak in spring in snow-influenced basins and a glaciermelt drought is a deficiency in the glaciermelt discharge peak in summer in glacierised basins. In 21 catchments in Austria and Norway we studied the meteorological conditions in the seasons preceding and at the time of snowmelt and glaciermelt drought events. Snowmelt droughts in Norway were mainly controlled by below-average winter precipitation, while in Austria both temperature and precipitation played a role. For glaciermelt droughts, the effect of below-average summer air temperature was dominant, both in Austria and Norway. Subsequently, we investigated the impacts of temperature-related drought types (i.e. snowmelt and glaciermelt drought, but also cold and warm snow season drought and rain-to-snow-season drought). In historical archives and drought databases for the US and Europe many impacts were found that can be attributed to these temperature-related hydrological drought types, mainly in the agriculture and electricity production (hydropower) sectors. However, drawing conclusions on the frequency of occurrence of different drought types from reported impacts is difficult, mainly because of reporting biases and the inevitably limited spatial and temporal scales of the information. Finally, this study shows that complete integration of quantitative analysis of causing factors and qualitative analysis of impacts of temperature-related droughts is not yet possible. 
Analysis of selected events, however, points out that it can be a promising research area if more data on drought impacts become available.

  11. THE IMACS CLUSTER BUILDING SURVEY. V. FURTHER EVIDENCE FOR STARBURST RECYCLING FROM QUANTITATIVE GALAXY MORPHOLOGIES

    SciTech Connect

    Abramson, Louis E.; Gladders, Michael D.; Dressler, Alan; Oemler, Augustus Jr.; Monson, Andrew; Persson, Eric; Poggianti, Bianca M.; Vulcani, Benedetta

    2013-11-10

Using J- and Ks-band imaging obtained as part of the IMACS Cluster Building Survey (ICBS), we measure Sérsic indices for 2160 field and cluster galaxies at 0.31 < z < 0.54. Using both mass- and magnitude-limited samples, we compare the distributions for spectroscopically determined passive, continuously star-forming, starburst, and post-starburst systems and show that previously established spatial and statistical connections between these types extend to their gross morphologies. Outside of cluster cores, we find close structural ties between starburst and continuously star-forming, as well as post-starburst and passive types, but not between starbursts and post-starbursts. These results independently support two conclusions presented in Paper II of this series: (1) most starbursts are the product of a non-disruptive triggering mechanism that is insensitive to global environment, such as minor mergers; (2) starbursts and post-starbursts generally represent transient phases in the lives of 'normal' star-forming and quiescent galaxies, respectively, originating from and returning to these systems in closed 'recycling' loops. In this picture, spectroscopically identified post-starbursts constitute a minority of all recently terminated starbursts, largely ruling out the typical starburst as a quenching event in all but the densest environments.

  12. The JCMT Gould Belt Survey: A Quantitative Comparison Between SCUBA-2 Data Reduction Methods

    E-print Network

    Mairs, S; Kirk, H; Graves, S; Buckle, J; Beaulieu, S F; Berry, D S; Broekhoven-Fiene, H; Currie, M J; Fich, M; Hatchell, J; Jenness, T; Mottram, J C; Nutter, D; Pattle, K; Pineda, J E; Salji, C; Di Francesco, J; Hogerheijde, M R; Ward-Thompson, D

    2015-01-01

    Performing ground-based submillimetre observations is a difficult task as the measurements are subject to absorption and emission from water vapour in the Earth's atmosphere and time variation in weather and instrument stability. Removing these features and other artifacts from the data is a vital process which affects the characteristics of the recovered astronomical structure we seek to study. In this paper, we explore two data reduction methods for data taken with the Submillimetre Common-User Bolometer Array-2 (SCUBA-2) at the James Clerk Maxwell Telescope (JCMT). The JCMT Legacy Reduction 1 (JCMT LR1) and The Gould Belt Legacy Survey Legacy Release 1 (GBS LR1) reduction both use the same software, Starlink, but differ in their choice of data reduction parameters. We find that the JCMT LR1 reduction is suitable for determining whether or not compact emission is present in a given region and the GBS LR1 reduction is tuned in a robust way to uncover more extended emission, which better serves more in-depth ...

  13. On the Design of Empirical Studies to Evaluate Software Patterns: A Survey

    E-print Network

    Young, R. Michael

This survey aids researchers in designing empirical studies of software patterns by summarizing the study designs of software patterns available in the literature. The important components of these study designs include…

  14. SDSS-IV MaNGA: Survey Design and Progress

    NASA Astrophysics Data System (ADS)

    Yan, Renbin; MaNGA Team

    2016-01-01

The ongoing SDSS-IV/MaNGA Survey will obtain integral field spectroscopy at a resolution of R~2000 with a wavelength coverage from 3,600A to 10,300A for 10,000 nearby galaxies. Within each 3 degree diameter pointing of the 2.5m Sloan Telescope, we deploy 17 hexagonal fiber bundles with sizes ranging from 12 to 32 arcsec in diameter. The bundles are built with 2 arcsec fibers and have a 56% fill factor. During observations, we obtain sets of exposures at 3 different dither positions to achieve near-critical sampling of the effective point spread function, which has a FWHM of about 2.5 arcsec, corresponding to 1-2 kpc for the majority of the galaxies targeted. The flux calibration is done using 12 additional mini-fiber-bundles targeting standard stars simultaneously with science targets, achieving a calibration accuracy better than 5% over 90% of the wavelength range. The target galaxies are selected to ensure uniform spatial coverage in units of effective radii for the majority of the galaxies while maximizing spatial resolution. About 2/3 of the sample is covered out to 1.5Re (primary sample) and 1/3 of the sample to 2.5Re (secondary sample). The sample is designed to have approximately equal representation from high- and low-mass galaxies while maintaining volume-limited selection at fixed absolute magnitudes. We obtain an average S/N of 4 per Angstrom in r-band continuum at a surface brightness of 23 AB arcsec^-2. With spectral stacking in an elliptical annulus covering 1-1.5Re, our primary sample galaxies have a median S/N of ~60 per Angstrom in r-band.

  15. The JCMT Gould Belt Survey: a quantitative comparison between SCUBA-2 data reduction methods

    NASA Astrophysics Data System (ADS)

    Mairs, S.; Johnstone, D.; Kirk, H.; Graves, S.; Buckle, J.; Beaulieu, S. F.; Berry, D. S.; Broekhoven-Fiene, H.; Currie, M. J.; Fich, M.; Hatchell, J.; Jenness, T.; Mottram, J. C.; Nutter, D.; Pattle, K.; Pineda, J. E.; Salji, C.; Francesco, J. Di; Hogerheijde, M. R.; Ward-Thompson, D.; JCMT Gould Belt survey Team

    2015-12-01

    Performing ground-based submillimetre observations is a difficult task as the measurements are subject to absorption and emission from water vapour in the Earth's atmosphere and time variation in weather and instrument stability. Removing these features and other artefacts from the data is a vital process which affects the characteristics of the recovered astronomical structure we seek to study. In this paper, we explore two data reduction methods for data taken with the Submillimetre Common-User Bolometer Array-2 (SCUBA-2) at the James Clerk Maxwell Telescope (JCMT). The JCMT Legacy Reduction 1 (JCMT LR1) and The Gould Belt Legacy Survey Legacy Release 1 (GBS LR1) reduction both use the same software (STARLINK) but differ in their choice of data reduction parameters. We find that the JCMT LR1 reduction is suitable for determining whether or not compact emission is present in a given region and the GBS LR1 reduction is tuned in a robust way to uncover more extended emission, which better serves more in-depth physical analyses of star-forming regions. Using the GBS LR1 method, we find that compact sources are recovered well, even at a peak brightness of only three times the noise, whereas the reconstruction of larger objects requires much care when drawing boundaries around the expected astronomical signal in the data reduction process. Incorrect boundaries can lead to false structure identification or it can cause structure to be missed. In the JCMT LR1 reduction, the extent of the true structure of objects larger than a point source is never fully recovered.

  16. Applications of numerical optimization methods to helicopter design problems: A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

This paper surveys applications of mathematical programming methods used to improve the design of helicopters and their components. Applications of multivariable search techniques in finite-dimensional space are considered. Five categories of helicopter design problems are covered: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control-system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  17. ESTIMATING AMPHIBIAN OCCUPANCY RATES IN PONDS UNDER COMPLEX SURVEY DESIGNS

    EPA Science Inventory

    Monitoring the occurrence of specific amphibian species in ponds is one component of the US Geological Survey's Amphibian Monitoring and Research Initiative. Two collaborative studies were conducted in Olympic National Park and southeastern region of Oregon. The number of ponds...

  18. Sample size and optimal sample design in tuberculosis surveys

    PubMed Central

    Sánchez-Crespo, J. L.

    1967-01-01

    Tuberculosis surveys sponsored by the World Health Organization have been carried out in different communities during the last few years. Apart from the main epidemiological findings, these surveys have provided basic statistical data for use in the planning of future investigations. In this paper an attempt is made to determine the sample size desirable in future surveys that include one of the following examinations: tuberculin test, direct microscopy, and X-ray examination. The optimum cluster sizes are found to be 100-150 children under 5 years of age in the tuberculin test, at least 200 eligible persons in the examination for excretors of tubercle bacilli (direct microscopy) and at least 500 eligible persons in the examination for persons with radiological evidence of pulmonary tuberculosis (X-ray). Modifications of the optimum sample size in combined surveys are discussed. PMID:5300008
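The trade-off behind these optimum cluster sizes is commonly expressed through the design effect, deff = 1 + (m - 1) * rho, which inflates the sample size needed under simple random sampling by the cost of intracluster correlation. The sketch below uses this standard textbook formula with hypothetical inputs; it is not the computation from this 1967 paper.

```python
import math

def cluster_sample_size(n_srs, cluster_size, icc):
    """Inflate a simple-random-sampling sample size by the design effect
    deff = 1 + (m - 1) * rho for one-stage cluster sampling."""
    deff = 1 + (cluster_size - 1) * icc
    return math.ceil(n_srs * deff)

# Hypothetical: 400 persons suffice under SRS, clusters of 100,
# intracluster correlation 0.01
print(cluster_sample_size(400, 100, 0.01))  # 796
```

Larger clusters are cheaper to field but, for a fixed icc, raise deff and hence the total sample size, which is why each examination type has its own optimum cluster size.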

  19. The influence of the design matrix on treatment effect estimates in the quantitative analyses of single-subject experimental design research.

    PubMed

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M; Beretvas, S Natasha; Van den Noortgate, Wim

    2014-09-01

    The quantitative methods for analyzing single-subject experimental data have expanded during the last decade, including the use of regression models to statistically analyze the data, but still a lot of questions remain. One question is how to specify predictors in a regression model to account for the specifics of the design and estimate the effect size of interest. These quantitative effect sizes are used in retrospective analyses and allow synthesis of single-subject experimental study results which is informative for evidence-based decision making, research and theory building, and policy discussions. We discuss different design matrices that can be used for the most common single-subject experimental designs (SSEDs), namely, the multiple-baseline designs, reversal designs, and alternating treatment designs, and provide empirical illustrations. The purpose of this article is to guide single-subject experimental data analysts interested in analyzing and meta-analyzing SSED data. PMID:24902590
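To make the notion of a design matrix concrete: for a single AB (baseline-intervention) series, one common parameterization uses an intercept, a time trend, a phase dummy, and a time-since-intervention column so that level and slope changes are estimated separately. This is a generic illustration of one such matrix, not the specific matrices compared in the paper.

```python
import numpy as np

def ab_design_matrix(n_obs, phase_start):
    """Design matrix for one AB single-subject series with columns
    [intercept, time, phase dummy, time since intervention]."""
    t = np.arange(n_obs)
    phase = (t >= phase_start).astype(float)
    t_since = np.where(phase == 1, t - phase_start, 0.0)
    return np.column_stack([np.ones(n_obs), t, phase, t_since])

# 8 observations, intervention introduced at the 5th observation (index 4)
X = ab_design_matrix(8, 4)
print(X.shape)  # (8, 4)
```

Regressing the outcome on X gives the phase-dummy coefficient as the immediate level change and the last column's coefficient as the slope change, which is exactly the kind of specification choice the article examines.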

  20. Design-based treatment of unit nonresponse in environmental surveys using calibration weighting.

    PubMed

    Fattorini, Lorenzo; Franceschi, Sara; Maffei, Daniela

    2013-11-01

    Unit nonresponse is often a problem in sample surveys. It arises when the values of the survey variable cannot be recorded for some sampled units. In this paper, the use of nonresponse calibration weighting to treat nonresponse is considered in a complete design-based framework. Nonresponse is viewed as a fixed characteristic of the units. The approach is suitable in environmental and forest surveys when sampled sites cannot be reached by field crews. Approximate expressions of design-based bias and variance of the calibration estimator are derived and design-based consistency is investigated. Choice of auxiliary variables to perform calibration is discussed. Sen-Yates-Grundy, Horvitz-Thompson, and jackknife estimators of the sampling variance are proposed. Analytical and Monte Carlo results demonstrate the validity of the procedure when the relationship between survey and auxiliary variables is similar in respondent and nonrespondent strata. An application to a forest survey performed in Northeastern Italy is considered. PMID:24022794
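Calibration weighting of the kind described can be sketched with linear (GREG-style) calibration: design weights of the respondents are adjusted so that their weighted auxiliary totals reproduce known population totals. A generic sketch under that standard formulation, not the authors' estimator:

```python
import numpy as np

def linear_calibration(d, X, totals):
    """Linearly calibrate design weights d (one per respondent) so that the
    calibrated weights w satisfy w @ X == totals for auxiliary matrix X."""
    residual = totals - d @ X                       # shortfall in weighted totals
    lam = np.linalg.solve(X.T @ (d[:, None] * X), residual)
    return d * (1.0 + X @ lam)                     # w_i = d_i * (1 + x_i' lambda)

# Hypothetical respondents: equal design weights, auxiliaries = (1, x)
d = np.array([2.0, 2.0, 2.0, 2.0])
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
totals = np.array([10.0, 26.0])                    # known population totals
w = linear_calibration(d, X, totals)               # w @ X reproduces totals exactly
```

The bias properties discussed in the abstract hinge on how well the auxiliaries predict the survey variable in both respondent and nonrespondent strata; the algebra above only guarantees that the auxiliary totals are matched.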

  1. Methods for the design and administration of web-based surveys.

    PubMed

    Schleyer, T K; Forrest, J L

    2000-01-01

    This paper describes the design, development, and administration of a Web-based survey to determine the use of the Internet in clinical practice by 450 dental professionals. The survey blended principles of a controlled mail survey with data collection through a Web-based database application. The survey was implemented as a series of simple HTML pages and tested with a wide variety of operating environments. The response rate was 74.2 percent. Eighty-four percent of the participants completed the Web-based survey, and 16 percent used e-mail or fax. Problems identified during survey administration included incompatibilities/technical problems, usability problems, and a programming error. The cost of the Web-based survey was 38 percent less than that of an equivalent mail survey. A general formula for calculating breakeven points between electronic and hardcopy surveys is presented. Web-based surveys can significantly reduce turnaround time and cost compared with mail surveys and may enhance survey item completion rates. PMID:10887169
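The breakeven formula itself is not reproduced in the abstract, but under linear cost models (a fixed setup cost plus a per-respondent cost for each mode) the breakeven sample size is simply where the two cost lines cross. All figures below are hypothetical, chosen only to illustrate the shape of such a calculation.

```python
def breakeven_n(fixed_web, var_web, fixed_mail, var_mail):
    """Sample size at which total web-survey cost equals mail-survey cost,
    assuming cost(n) = fixed + var * n for each mode."""
    if var_mail <= var_web:
        raise ValueError("mail must cost more per respondent for a breakeven to exist")
    return (fixed_web - fixed_mail) / (var_mail - var_web)

# Hypothetical costs: web setup $2000 at $1/respondent vs mail setup $500 at $8/respondent
print(f"{breakeven_n(2000, 1.0, 500, 8.0):.0f}")  # 214
```

Above roughly 214 respondents in this example, the web survey's higher setup cost is amortized and it becomes the cheaper mode, consistent with the 38 percent saving the study reports at n = 450.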

  2. Survey design for lakes and reservoirs in the United States to assess contaminants in fish tissue

    EPA Science Inventory

    The National Lake Fish Tissue Study (NLFTS) was the first survey of fish contamination in lakes and reservoirs in the 48 conterminous states based on probability survey design. This study included the largest set (268) of persistent, bioaccumulative, and toxic (PBT) chemicals ev...

  3. The Laboratory Course Assessment Survey: A Tool to Measure Three Dimensions of Research-Course Design

    ERIC Educational Resources Information Center

    Corwin, Lisa A.; Runyon, Christopher; Robinson, Aspen; Dolan, Erin L.

    2015-01-01

    Course-based undergraduate research experiences (CUREs) are increasingly being offered as scalable ways to involve undergraduates in research. Yet few if any design features that make CUREs effective have been identified. We developed a 17-item survey instrument, the Laboratory Course Assessment Survey (LCAS), that measures students' perceptions…

  4. Study Quality in SLA: An Assessment of Designs, Analyses, and Reporting Practices in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2013-01-01

    This study assesses research and reporting practices in quantitative second language (L2) research. A sample of 606 primary studies, published from 1990 to 2010 in "Language Learning" and "Studies in Second Language Acquisition," was collected and coded for designs, statistical analyses, reporting practices, and outcomes (i.e., effect…

  5. ROBUST CONTROL DESIGN OF A SINGLE DEGREE-OF-FREEDOM MAGNETIC LEVITATION SYSTEM BY QUANTITATIVE FEEDBACK THEORY

    E-print Network

    Nagurka, Mark L.

    Milwaukee, WI 53233. Email: mark.nagurka@marquette.edu. ABSTRACT: A magnetic levitation (maglev) system … variations to improve system robustness.

  6. THE OPTICALLY UNBIASED GAMMA-RAY BURST HOST (TOUGH) SURVEY. I. SURVEY DESIGN AND CATALOGS

    SciTech Connect

    Hjorth, Jens; Malesani, Daniele; Fynbo, Johan P. U.; Kruehler, Thomas; Milvang-Jensen, Bo; Watson, Darach; Jakobsson, Pall; Schulze, Steve; Jaunsen, Andreas O.; Gorosabel, Javier; Levan, Andrew J.; Michalowski, Michal J.; Moller, Palle; Tanvir, Nial R.

    2012-09-10

    Long-duration gamma-ray bursts (GRBs) are powerful tracers of star-forming galaxies. We have defined a homogeneous subsample of 69 Swift GRB-selected galaxies spanning a very wide redshift range. Special attention has been devoted to making the sample optically unbiased through simple and well-defined selection criteria based on the high-energy properties of the bursts and their positions on the sky. Thanks to our extensive follow-up observations, this sample has now achieved a comparatively high degree of redshift completeness, and thus provides a legacy sample, useful for statistical studies of GRBs and their host galaxies. In this paper, we present the survey design and summarize the results of our observing program conducted at the ESO Very Large Telescope (VLT) aimed at obtaining the most basic properties of galaxies in this sample, including a catalog of R and K_s magnitudes and redshifts. We detect the host galaxies for 80% of the GRBs in the sample, although only 42% have K_s-band detections, which confirms that GRB-selected host galaxies are generally blue. The sample is not uniformly blue, however, with two extremely red objects detected. Moreover, galaxies hosting GRBs with no optical/NIR afterglows, whose identification therefore relies on X-ray localizations, are significantly brighter and redder than those with an optical/NIR afterglow. This supports a scenario where GRBs occurring in more massive and dusty galaxies frequently suffer high optical obscuration. Our spectroscopic campaign has resulted in 77% now having redshift measurements, with a median redshift of 2.14 ± 0.18. TOUGH alone includes 17 detected z > 2 Swift GRB host galaxies suitable for individual and statistical studies, a substantial increase over previous samples. Seven hosts have detections of the Lyα emission line, and we can exclude an early indication that Lyα emission is ubiquitous among GRB hosts, but confirm that Lyα is stronger in GRB-selected galaxies than in flux-limited samples of Lyman break galaxies.

  7. Submitted to : Computer Aided Design, 2000 A Survey of Computational Approaches to Three-

    E-print Network

    Shimada, Kenji


  8. Targeting Urban Watershed Stressor Gradients: Stream Survey Design, Ecological Responses, and Implications of Land Cover Resolution

    EPA Science Inventory

    We conducted a stream survey in the Narragansett Bay Watershed designed to target a gradient of development intensity, and to examine how associated changes in nutrients, carbon, and stressors affect periphyton and macroinvertebrates. Concentrations of nutrients, cations, and ani...

  9. Theory of Model-Based Geophysical Survey and Experimental Design

    E-print Network

    A theoretical framework from the field of Statistical Experimental Design (SED, a field of statistics) … is left to the end of the companion article in Part B. 1: Introduction to Statistical Experimental Design … structure (Maurer and Boerner, 1998a; Maurer et al., 2000); designing the interrogation of human experts …

  10. HomoSAR: bridging comparative protein modeling with quantitative structural activity relationship to design new peptides.

    PubMed

    Borkar, Mahesh R; Pissurlenkar, Raghuvir R S; Coutinho, Evans C

    2013-11-15

    Peptides play significant roles in the biological world. To optimize activity for a specific therapeutic target, peptide library synthesis is inevitable, which is time consuming and expensive. Computational approaches provide a promising way to elucidate the structural basis for the design of new peptides. Earlier, we proposed a novel methodology termed HomoSAR to gain insight into the structure-activity relationships underlying peptides. Based on an integrated approach, HomoSAR uses the principles of homology modeling in conjunction with the quantitative structure-activity relationship formalism to predict and design new peptide sequences with optimum activity. In the present study, we establish that the HomoSAR methodology can be universally applied to all classes of peptides irrespective of sequence length by studying HomoSAR on three peptide datasets, viz., angiotensin-converting enzyme inhibitory peptides, CAMEL-s antibiotic peptides, and hAmphiphysin-1 SH3 domain binding peptides, using a set of descriptors related to the hydrophobic, steric, and electronic properties of the 20 natural amino acids. Models generated for all three datasets have statistically significant correlation (r2), predictive (r2pred), and cross-validated (q2LOO) coefficients. The elegance of this technique lies in its simplicity and ability to extract all the information contained in the peptides to elucidate the underlying structure-activity relationships. The difficulties of correlating both sequence diversity and variation in length of the peptides with their biological activity can be addressed. The study has been able to identify the preferred or detrimental nature of amino acids at specific positions in the peptide sequences. PMID:24105965

  11. ESTIMATING PROPORTION OF AREA OCCUPIED UNDER COMPLEX SURVEY DESIGNS

    EPA Science Inventory

    Estimating the proportion of sites occupied, or proportion of area occupied (PAO), is a common problem in environmental studies. Typically, field surveys cannot ensure that occupancy of a site is detected perfectly. Maximum likelihood estimation of site occupancy rates when...
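The maximum likelihood approach the abstract alludes to can be illustrated with a minimal single-season occupancy sketch, assuming each site is visited K times and the number of detections per site is recorded; the grid-search fit and all data below are illustrative, not the study's actual estimator.

```python
import math

def neg_log_lik(psi, p, histories, K):
    """Negative log-likelihood of a single-season occupancy model.

    psi: probability a site is occupied; p: per-visit detection probability;
    histories: detections per site out of K visits. A site with no detections
    may be occupied-but-missed or genuinely unoccupied, hence the mixture term.
    """
    nll = 0.0
    for d in histories:
        if d > 0:
            lik = psi * (p ** d) * ((1 - p) ** (K - d))
        else:
            lik = psi * ((1 - p) ** K) + (1 - psi)
        nll -= math.log(lik)
    return nll

def fit_occupancy(histories, K, grid=99):
    """Crude grid-search MLE for (psi, p); real analyses use numeric optimizers."""
    best = None
    for i in range(1, grid + 1):
        for j in range(1, grid + 1):
            psi, p = i / (grid + 1), j / (grid + 1)
            nll = neg_log_lik(psi, p, histories, K)
            if best is None or nll < best[0]:
                best = (nll, psi, p)
    return best[1], best[2]
```

The naive estimate (fraction of sites with at least one detection) understates PAO whenever p < 1; the likelihood's mixture term for all-zero histories is what corrects for occupied-but-undetected sites.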

  12. Effect of survey design and catch rate estimation on total catch estimates in Chinook salmon fisheries

    USGS Publications Warehouse

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2012-01-01

    Roving–roving and roving–access creel surveys are the primary techniques used to obtain information on harvest of Chinook salmon Oncorhynchus tshawytscha in Idaho sport fisheries. Once interviews are conducted using roving–roving or roving–access survey designs, mean catch rate can be estimated with the ratio-of-means (ROM) estimator, the mean-of-ratios (MOR) estimator, or the MOR estimator with exclusion of short-duration (≤0.5 h) trips. Our objective was to examine the relative bias and precision of total catch estimates obtained from use of the two survey designs and three catch rate estimators for Idaho Chinook salmon fisheries. Information on angling populations was obtained by direct visual observation of portions of Chinook salmon fisheries in three Idaho river systems over an 18-d period. Based on data from the angling populations, Monte Carlo simulations were performed to evaluate the properties of the catch rate estimators and survey designs. Among the three estimators, the ROM estimator provided the most accurate and precise estimates of mean catch rate and total catch for both roving–roving and roving–access surveys. On average, the root mean square error of simulated total catch estimates was 1.42 times greater and relative bias was 160.13 times greater for roving–roving surveys than for roving–access surveys. Length-of-stay bias and nonstationary catch rates in roving–roving surveys both appeared to affect catch rate and total catch estimates. Our results suggest that use of the ROM estimator in combination with an estimate of angler effort provided the least biased and most precise estimates of total catch for both survey designs. However, roving–access surveys were more accurate than roving–roving surveys for Chinook salmon fisheries in Idaho.
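The three catch-rate estimators compared in the abstract can be sketched directly; the trip data below are invented completed-trip interviews, and the 0.5 h cutoff follows the abstract's short-trip exclusion variant.

```python
# The two catch-rate estimators named in the abstract, plus the short-trip
# exclusion variant. Trips are invented (fish caught, hours fished) pairs.

def ratio_of_means(trips):
    """ROM: total catch divided by total effort."""
    return sum(c for c, h in trips) / sum(h for c, h in trips)

def mean_of_ratios(trips, min_hours=0.0):
    """MOR: average of per-trip catch rates, optionally excluding trips
    of min_hours or less (the abstract's 0.5 h cutoff)."""
    rates = [c / h for c, h in trips if h > min_hours]
    return sum(rates) / len(rates)

trips = [(2, 4.0), (0, 0.5), (1, 2.0)]
rom = ratio_of_means(trips)                      # 3 / 6.5 ≈ 0.46 fish/h
mor = mean_of_ratios(trips)                      # (0.5 + 0.0 + 0.5) / 3 ≈ 0.33
mor_excl = mean_of_ratios(trips, min_hours=0.5)  # the 0.5 h trip is dropped
```

MOR weights every trip equally, so short trips with unrepresentative rates can bias it; ROM effectively weights trips by duration, which is one intuition for why it behaved better under length-of-stay bias in the study.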

  13. Temporally adaptive sampling: a case study in rare species survey design with marbled salamanders (Ambystoma opacum).

    PubMed

    Charney, Noah D; Kubel, Jacob E; Eiseman, Charles S

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224
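The payoff of concentrating effort in good years can be shown with a toy simulation, assuming a hypothetical species whose per-survey detection probability swings between good and poor years; all probabilities here are invented, not the salamander estimates.

```python
import random

random.seed(1)

# Toy model: detection probability per survey is high in "good" years and
# low in "poor" years, and year quality is shared across the region, so a
# regional forecast can tell surveyors when to go out. The adaptive
# strategy simply skips poor years. All probabilities are invented.

GOOD_P, POOR_P = 0.8, 0.2

def run(adaptive, n_years=1000, surveys_per_year=10):
    """Return the fraction of surveys that detect the species."""
    detections = trials = 0
    for _ in range(n_years):
        good_year = random.random() < 0.5
        if adaptive and not good_year:
            continue  # save the effort for a better year
        p = GOOD_P if good_year else POOR_P
        detections += sum(random.random() < p for _ in range(surveys_per_year))
        trials += surveys_per_year
    return detections / trials

# The adaptive success rate approaches GOOD_P, while surveying every year
# averages the good and poor rates.
```

The gain per unit effort comes entirely from reallocating surveys, which is why the abstract can describe the strategy as costing very little.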

  14. Temporally Adaptive Sampling: A Case Study in Rare Species Survey Design with Marbled Salamanders (Ambystoma opacum)

    PubMed Central

    Charney, Noah D.; Kubel, Jacob E.; Eiseman, Charles S.

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224

  15. Design and delivery of a national pilot survey of capabilities

    E-print Network

    Tenneti, Raji; Goodman-Deane, Joy; Langdon, Patrick; Waller, Sam; Ruggeri, Kai; Clarkson, P. John; Huppert, Felicia A.

    2014-02-19

    Raji Tenneti, School of Primary, Aboriginal and Rural Health Care, Faculty of Medicine... Felicia A. Huppert, Well-being Institute, Department of Psychiatry, University of Cambridge, Hills Road, Cambridge CB2 0QQ, UK. Email: fah2@cam.ac.uk. Abstract: Understanding the numbers of people with different levels...

  16. Practical aspects of applied optimized survey design for electrical resistivity tomography

    NASA Astrophysics Data System (ADS)

    Wilkinson, Paul B.; Loke, Meng Heng; Meldrum, Philip I.; Chambers, Jonathan E.; Kuras, Oliver; Gunn, David A.; Ogilvy, Richard D.

    2012-04-01

    The use of optimized resistivity tomography surveys to acquire field data imposes extra constraints on the design strategy beyond maximizing the quality of the resulting tomographic image. In this paper, methods are presented to (1) minimize electrode polarization effects, (2) make efficient use of parallel measurement channels, and (3) incorporate data noise estimates in the optimization process. (1) A simulated annealing algorithm is used to rearrange the optimized measurement sequences to minimize polarization errors. The method is developed using random survey designs and is demonstrated to be effective for use with single and multichannel optimized surveys. (2) An optimization algorithm is developed to design surveys by successive addition of multichannel groups of measurements rather than individual electrode configurations. The multichannel surveys are shown to produce results nearly as close to optimal as equivalent single channel surveys, while reducing data collection times by an order of magnitude. (3) Random errors in the data are accounted for by weighting the electrode configurations in the optimization process according to a simple error model incorporating background and voltage-dependent noise. The use of data weighting produces optimized surveys that are more robust in the presence of noise, while maintaining as much of the image resolution of the noise-free designs as possible. All the new methods described in this paper are demonstrated using both synthetic and real data, the latter having been measured on an active landslide using a permanently installed geoelectrical monitoring system.
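Point (3)'s error model can be sketched as follows; the functional form (background plus voltage-proportional noise, combined in quadrature) and the parameter values are assumptions for illustration, not the paper's fitted model.

```python
import math

def config_weight(voltage_v, background_v=1e-4, rel_noise=0.02):
    """Weight for one electrode configuration under an assumed error model.

    sigma = sqrt(background^2 + (rel_noise * V)^2): a fixed background term
    plus a voltage-proportional term. Weighting a configuration by 1/sigma
    lets the optimizer trade its resolution contribution against its
    expected measurement error.
    """
    sigma = math.sqrt(background_v ** 2 + (rel_noise * voltage_v) ** 2)
    return 1.0 / sigma
```

For tiny expected voltages sigma is pinned at the background level, so weights saturate rather than diverge; for large voltages the proportional term dominates.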

  17. [Development of a simple quantitative method for the strontium-89 concentration of radioactive liquid waste using the plastic scintillation survey meter for beta rays].

    PubMed

    Narita, Hiroto; Tsuchiya, Yuusuke; Hirase, Kiyoshi; Uchiyama, Mayuki; Fukushi, Masahiro

    2012-11-01

    Strontium-89 (89Sr: pure beta emitter, Emax 1.495 MeV (100%), half-life: 50.5 days) chloride is used for pain relief from bone metastases. Assay of 89Sr is difficult because it is a pure beta emitter. For management of 89Sr, we evaluated a simple quantitative method for the 89Sr concentration of radioactive liquid waste using a plastic scintillation survey meter for beta rays. The counting efficiency of the survey meter with this method was 35.95%. A simple 30-minute measurement of 2 ml of the sample made quantitative measurement of 89Sr practical. Reducing self-absorption of the beta rays in the solution by counting on polyethylene paper improved the counting efficiency. Our method made it easy to manage the radioactive liquid waste under the legal restrictions. PMID:23402205
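The reported counting efficiency makes the conversion from count rate to activity concentration straightforward; the sketch below takes the efficiency and sample volume from the abstract, while the gross and background count rates are invented for illustration.

```python
# Converting a beta survey-meter count rate into an activity concentration.
# The efficiency (35.95%) and sample volume (2 ml) come from the abstract;
# the count rates passed in below are invented for illustration.

EFFICIENCY = 0.3595   # registered counts per 89Sr decay
SAMPLE_ML = 2.0       # measured sample volume in millilitres

def activity_bq_per_ml(gross_cpm, background_cpm):
    """Activity concentration in Bq/ml from count rates in counts per minute."""
    net_cps = (gross_cpm - background_cpm) / 60.0   # net counts per second
    activity_bq = net_cps / EFFICIENCY              # decays per second (Bq)
    return activity_bq / SAMPLE_ML

# e.g. activity_bq_per_ml(500, 20) ≈ 11.1 Bq/ml
```

With a 30-minute count, even modest count rates give enough total counts for the statistical uncertainty to be small relative to the efficiency calibration.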

  18. HETDEX pilot survey for emission-line galaxies - I. Survey design, performance, and catalog

    E-print Network

    Adams, Joshua J; Hill, Gary J; Gebhardt, Karl; Drory, Niv; Hao, Lei; Bender, Ralf; Byun, Joyce; Ciardullo, Robin; Cornell, Mark E; Finkelstein, Steven L; Fry, Alex; Gawiser, Eric; Gronwall, Caryl; Hopp, Ulrich; Jeong, Donghui; Kelz, Andreas; Kelzenberg, Ralf; Komatsu, Eiichiro; MacQueen, Phillip J; Murphy, Jeremy; Odoms, P Samuel; Roth, Martin; Schneider, Donald P; Tufts, Joseph R; Wilkinson, Christopher P

    2010-01-01

    We present a catalog of emission-line galaxies selected solely by their emission-line fluxes using a wide-field integral field spectrograph. This work is partially motivated as a pilot survey for the upcoming Hobby-Eberly Telescope Dark Energy Experiment (HETDEX). We describe the observations, reductions, detections, redshift classifications, line fluxes, and counterpart information for 397 emission-line galaxies detected over 169 sq.arcmin with a 3500-5800 Ang. bandpass under 5 Ang. full-width-half-maximum (FWHM) spectral resolution. The survey's best sensitivity for unresolved objects under photometric conditions is between 4-20 E-17 erg/s/sq.cm depending on the wavelength, and Ly-alpha luminosities between 3-6 E42 erg/s are detectable. This survey method complements narrowband and color-selection techniques in the search for high redshift galaxies with its different selection properties and large volume probed. The four survey fields within the COSMOS, GOODS-N, MUNICS, and XMM-LSS areas are rich with exist...

  19. A survey of aerobraking orbital transfer vehicle design concepts

    NASA Astrophysics Data System (ADS)

    Park, Chul

    1987-01-01

    The five existing design concepts of the aerobraking orbital transfer vehicle (namely, the raked sphere-cone designs, conical lifting-brake, raked elliptic-cone, lifting-body, and ballute) are reviewed and critiqued. Historical backgrounds, and the geometrical, aerothermal, and operational features of these designs are reviewed first. Then, the technological requirements for the vehicle (namely, navigation, aerodynamic stability and control, afterbody flow impingement, nonequilibrium radiation, convective heat-transfer rates, mission abort and multiple atmospheric passes, transportation and construction, and the payload-to-vehicle weight requirements) are delineated by summarizing the recent advancements made on these issues. Each of the five designs are critiqued and rated on these issues. The highest and the lowest ratings are given to the raked sphere-cone and the ballute design, respectively.

  20. Systematic Assessment of Survey Scan and MS2-Based Abundance Strategies for Label-Free Quantitative Proteomics Using High-Resolution MS Data

    PubMed Central

    2015-01-01

    While survey-scan-based label-free methods showed no compelling benefit over fragment-ion (MS2)-based approaches when low-resolution mass spectrometry (MS) was used, the growing prevalence of high-resolution analyzers may have changed the game. This necessitates an updated, comparative investigation of these approaches for data acquired by high-resolution MS. Here, we compared survey-scan-based (ion current, IC) and MS2-based abundance features, including spectral count (SpC) and MS2 total ion current (MS2-TIC), for quantitative analysis using various high-resolution LC/MS data sets. Key discoveries include: (i) study with seven different biological data sets revealed only IC achieved high reproducibility for lower-abundance proteins; (ii) evaluation with 5-replicate analyses of a yeast sample showed IC provided much higher quantitative precision and lower missing data; (iii) IC, SpC, and MS2-TIC all showed good quantitative linearity (R2 > 0.99) over a >1000-fold concentration range; (iv) both MS2-TIC and IC showed good linear response to various protein loading amounts but not SpC; (v) quantification using a well-characterized CPTAC data set showed that IC exhibited markedly higher quantitative accuracy, higher sensitivity, and lower false-positives/false-negatives than both SpC and MS2-TIC. Therefore, IC achieved overall superior performance over the MS2-based strategies in terms of reproducibility, missing data, quantitative dynamic range, quantitative accuracy, and biomarker discovery. PMID:24635752

  1. European cardiac resynchronization therapy survey: rationale and design

    PubMed Central

    2009-01-01

    Aims The European cardiac resynchronization therapy (CRT) Survey is a joint initiative taken by the Heart Failure Association (HFA) and European Heart Rhythm Association (EHRA) of the European Society of Cardiology. The primary objective is to describe the current European practice and routines associated with CRT/CRT-D implantations based on a wide range of sampling in 13 countries. Methods and results The data collected should provide useful information, including demographics and clinical characteristics, diagnostic criteria, implantation routines and techniques, short-term outcomes, adverse experience, and assessment of adherence to guideline recommendations. PMID:19228801

  2. Modified Universal Design Survey: Enhancing Operability of Launch Vehicle Ground Crew Worksites

    NASA Technical Reports Server (NTRS)

    Blume, Jennifer L.

    2010-01-01

    Operability is a driving requirement for next generation space launch vehicles. Launch site ground operations include numerous operator tasks to prepare the vehicle for launch or to perform preflight maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To promote operability, a Design Quality Evaluation Survey based on the Universal Design framework was developed to support Human Factors Engineering (HFE) evaluation for NASA's launch vehicles. Universal Design per se is not a priority for launch vehicle processing; however, applying principles of Universal Design will increase the probability of an error-free and efficient design, which promotes operability. The Design Quality Evaluation Survey incorporates and tailors the seven Universal Design Principles and adds new measures for Safety and Efficiency. Adapting an approach proven to measure Universal Design performance in products, each principle is associated with multiple performance measures which are rated with the degree to which the statement is true. The Design Quality Evaluation Survey was employed for several launch vehicle ground processing worksite analyses. The tool was found to be most useful for comparative judgments as opposed to an assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.

  3. Sample and design considerations in post-disaster mental health needs assessment tracking surveys

    PubMed Central

    Kessler, Ronald C.; Keane, Terence M.; Ursano, Robert J.; Mokdad, Ali; Zaslavsky, Alan M.

    2009-01-01

    Although needs assessment surveys are carried out after many large natural and man-made disasters, synthesis of findings across these surveys and disaster situations about patterns and correlates of need is hampered by inconsistencies in study designs and measures. Recognizing this problem, the US Substance Abuse and Mental Health Services Administration (SAMHSA) assembled a task force in 2004 to develop a model study design and interview schedule for use in post-disaster needs assessment surveys. The US National Institute of Mental Health subsequently approved a plan to establish a center to implement post-disaster mental health needs assessment surveys in the future using an integrated series of measures and designs of the sort proposed by the SAMHSA task force. A wide range of measurement, design, and analysis issues will arise in developing this center. Given that the least widely discussed of these issues concerns study design, the current report focuses on the most important sampling and design issues proposed for this center based on our experiences with the SAMHSA task force, subsequent Katrina surveys, and earlier work in other disaster situations. PMID:19035440

  4. Hit by a Perfect Storm? Art & Design in the National Student Survey

    ERIC Educational Resources Information Center

    Yorke, Mantz; Orr, Susan; Blair, Bernadette

    2014-01-01

    There has long been the suspicion amongst staff in Art & Design that the ratings given to their subject disciplines in the UK's National Student Survey are adversely affected by a combination of circumstances--a "perfect storm". The "perfect storm" proposition is tested by comparing ratings for Art & Design with…

  5. THE HETDEX PILOT SURVEY. I. SURVEY DESIGN, PERFORMANCE, AND CATALOG OF EMISSION-LINE GALAXIES

    SciTech Connect

    Adams, Joshua J.; Blanc, Guillermo A.; Gebhardt, Karl; Hao, Lei; Byun, Joyce; Fry, Alex; Jeong, Donghui; Komatsu, Eiichiro; Hill, Gary J.; Cornell, Mark E.; MacQueen, Phillip J.; Drory, Niv; Bender, Ralf; Hopp, Ulrich; Kelzenberg, Ralf; Ciardullo, Robin; Gronwall, Caryl; Finkelstein, Steven L.; Gawiser, Eric; Kelz, Andreas

    2011-01-15

    We present a catalog of emission-line galaxies selected solely by their emission-line fluxes using a wide-field integral field spectrograph. This work is partially motivated as a pilot survey for the upcoming Hobby-Eberly Telescope Dark Energy Experiment. We describe the observations, reductions, detections, redshift classifications, line fluxes, and counterpart information for 397 emission-line galaxies detected over 169 arcmin² with a 3500-5800 Å bandpass under 5 Å full-width-half-maximum (FWHM) spectral resolution. The survey's best sensitivity for unresolved objects under photometric conditions is between 4 and 20 × 10⁻¹⁷ erg s⁻¹ cm⁻² depending on the wavelength, and Lyα luminosities between 3 × 10⁴² and 6 × 10⁴² erg s⁻¹ are detectable. This survey method complements narrowband and color-selection techniques in the search for high-redshift galaxies with its different selection properties and large volume probed. The four survey fields within the COSMOS, GOODS-N, MUNICS, and XMM-LSS areas are rich with existing, complementary data. We find 105 galaxies via their high-redshift Lyα emission at 1.9 < z < 3.8, and the majority of the remaining objects are low-redshift [O II]3727 emitters at z < 0.56. The classification between low- and high-redshift objects depends on rest-frame equivalent width (EW), as well as other indicators, where available. Based on matches to X-ray catalogs, the active galactic nuclei fraction among the Lyα emitters is 6%. We also analyze the survey's completeness and contamination properties through simulations. We find five high-z, highly significant, resolved objects with FWHM sizes >44 arcsec² which appear to be extended Lyα nebulae. We also find three high-z objects with rest-frame Lyα EW above the level believed to be achievable with normal star formation, EW₀ > 240 Å. Future papers will investigate the physical properties of this sample.

  6. Effects of sample survey design on the accuracy of classification tree models in species distribution models

    USGS Publications Warehouse

    Edwards, T.C., Jr.; Cutler, D.R.; Zimmermann, N.E.; Geiser, L.; Moisen, G.G.

    2006-01-01

    We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by resubstitution rates were similar for each lichen species irrespective of the underlying sample survey form. Cross-validation estimates of prediction accuracies were lower than resubstitution accuracies for all species and both design types, and in all cases were closer to the true prediction accuracies based on the EVALUATION data set. We argue that greater emphasis should be placed on calculating and reporting cross-validation accuracy rates rather than simple resubstitution accuracy rates. Evaluation of the DESIGN and PURPOSIVE tree models on the EVALUATION data set shows significantly lower prediction accuracy for the PURPOSIVE tree models relative to the DESIGN models, indicating that non-probabilistic sample surveys may generate models with limited predictive capability. These differences were consistent across all four lichen species, with 11 of the 12 possible species and sample survey type comparisons having significantly lower accuracy rates. Some differences in accuracy were as large as 50%. The classification tree structures also differed considerably both among and within the modelled species, depending on the sample survey form. Overlap in the predictor variables selected by the DESIGN and PURPOSIVE tree models ranged from only 20% to 38%, indicating the classification trees fit the two evaluated survey forms on different sets of predictor variables. The magnitude of these differences in predictor variables throws doubt on ecological interpretation derived from prediction models based on non-probabilistic sample surveys. © 2006 Elsevier B.V. All rights reserved.

  7. Net Survey: "Top Ten Mistakes" in Academic Web Design.

    ERIC Educational Resources Information Center

    Petrik, Paula

    2000-01-01

    Highlights the top ten mistakes in academic Web design: (1) bloated graphics; (2) scaling images; (3) dense text; (4) lack of contrast; (5) font size; (6) looping animations; (7) courseware authoring software; (8) scrolling/long pages; (9) excessive download; and (10) the nothing site. Includes resources. (CMK)

  8. A survey of design methods for failure detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1975-01-01

    A number of methods for the detection of abrupt changes (such as failures) in stochastic dynamical systems were surveyed. The class of linear systems was emphasized, but the basic concepts, if not the detailed analyses, carry over to other classes of systems. The methods surveyed range from the design of specific failure-sensitive filters, to the use of statistical tests on filter innovations, to the development of jump process formulations. Tradeoffs in complexity versus performance are discussed.

  9. A survey of design methods for failure detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1975-01-01

    A number of methods for detecting abrupt changes (such as failures) in stochastic dynamical systems are surveyed. The survey concentrates on the class of linear systems, but the basic concepts, if not the detailed analyses, carry over to other classes of systems. The methods surveyed range from the design of specific failure-sensitive filters, to the use of statistical tests on filter innovations, to the development of jump process formulations. Tradeoffs in complexity versus performance are discussed.

  10. Lessons Learned in Interdisciplinary Professional Development Designed to Promote the Teaching of Quantitative Literacy

    ERIC Educational Resources Information Center

    Lardner, Emily; Bookman, Jack

    2013-01-01

    In this paper, we will describe the challenges and insights gained from conducting professional development workshops aimed at helping faculty prepare materials to support the development of students' quantitative skills in different disciplinary contexts. We will examine some of the mistakes we made, and misconceptions we had, in conducting…

  11. The inclusion of open-ended questions on quantitative surveys of children: Dealing with unanticipated responses relating to child abuse and neglect.

    PubMed

    Lloyd, Katrina; Devine, Paula

    2015-10-01

    Web surveys have been shown to be a viable, and relatively inexpensive, method of data collection with children. For this reason, the Kids' Life and Times (KLT) was developed as an annual online survey of 10- and 11-year-old children. Each year, approximately 4,000 children participate in the survey. Throughout the six years that KLT has been running, a range of questions has been asked that are both policy-relevant and important to the lives of children. Given the method employed by the survey, no extremely sensitive questions that might cause the children distress are included. The majority of questions on KLT are closed, yielding quantitative data that are analysed statistically; however, one regular open-ended question is included at the end of KLT each year so that the children can suggest questions that they think should be asked on the survey the following year. While most of the responses are innocuous, each year a small minority of children suggest questions on child abuse and neglect. This paper reports the responses to this question and reflects on how researchers can, and should, deal with this issue from both a methodological and an ethical perspective. PMID:25952476

  12. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  13. Sampling design for the 1980 commercial and multifamily residential building survey

    SciTech Connect

    Bowen, W.M.; Olsen, A.R.; Nieves, A.L.

    1981-06-01

    Details of a proposed sample design for the 1980 Commercial and Multifamily Building Energy Performance Survey are presented. The objective of the survey is to assess the extent to which new building design practices comply with the proposed 1980 Energy Budget Levels for Commercial and Multifamily Residential Building Designs (DEB80). The procedure will be to: identify a small number of building types which account for the majority of commercial buildings constructed in the U.S.A.; conduct a separate survey for each building type; and include only buildings designed during 1980. For each building in the survey, the Design Energy Consumption (DEC80) will be determined by the DOE-2.1 computer program. The quantity X = DEC80 - DEB80 will be calculated for each building as a measure of its compliance with DEB80. These X quantities will then be used to compute sample statistics. Inferences about nationwide compliance with DEB80 may then be made for each building type. This report provides details of the population, sampling frame, stratification, sample size, and implementation of the sampling plan.
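    The compliance statistic described in the abstract reduces to simple arithmetic per building; a minimal sketch (illustrative numbers only, not survey data):

```python
# Hypothetical illustration of the survey's compliance statistic:
# X = DEC80 - DEB80 per building; X <= 0 means the design meets its budget.
from statistics import mean, stdev

def compliance_summary(dec80, deb80):
    """dec80: design energy consumption per building (DOE-2.1 output);
    deb80: the corresponding design energy budget levels."""
    x = [c - b for c, b in zip(dec80, deb80)]
    n = len(x)
    return {
        "mean_X": mean(x),
        "std_err": stdev(x) / n ** 0.5,
        "fraction_complying": sum(1 for v in x if v <= 0) / n,
    }

# One building type, illustrative values:
summary = compliance_summary(dec80=[55.0, 62.0, 48.0, 70.0],
                             deb80=[60.0, 60.0, 55.0, 65.0])
print(summary)
```

    In the actual survey these statistics would be computed within each stratum and building type before making nationwide inferences.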

  14. Application of a Modified Universal Design Survey for Evaluation of Ares 1 Ground Crew Worksites

    NASA Technical Reports Server (NTRS)

    Blume, Jennifer L.

    2010-01-01

    Operability is a driving requirement for NASA's Ares 1 launch vehicle. Launch site ground operations include several operator tasks to prepare the vehicle for launch or to perform maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To support design evaluation, the Ares 1 Upper Stage (US) element Human Factors Engineering (HFE) group developed a survey based on the Universal Design approach. Universal Design is a process to create products that can be used effectively by as many people as possible. Universal Design per se is not a priority for Ares 1 because launch vehicle processing is a specialized skill and not akin to a consumer product that should be used by all people of all abilities. However, applying principles of Universal Design will increase the probability of an error-free and efficient design, which is a priority for Ares 1. The Design Quality Evaluation Survey centers on the following seven principles: (1) Equitable use, (2) Flexibility in use, (3) Simple and intuitive use, (4) Perceptible information, (5) Tolerance for error, (6) Low physical effort, (7) Size and space for approach and use. Each principle is associated with multiple evaluation criteria which were rated with the degree to which the statement is true. All statements are phrased in the most positive form, that is, as the design goal, so that the degree to which judgments tend toward "completely agree" directly reflects how good the design is. The Design Quality Evaluation Survey was employed for several US analyses. The tool was found to be most useful for comparative judgments as opposed to an assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.

  15. Composite Interval Mapping Based on Lattice Design for Error Control May Increase Power of Quantitative Trait Locus Detection

    PubMed Central

    Huang, Zhongwen; Zhao, Tuanjie; Xing, Guangnan; Gai, Junyi; Guan, Rongzhan

    2015-01-01

    Experimental error control is very important in quantitative trait locus (QTL) mapping. Although numerous statistical methods have been developed for QTL mapping, a QTL detection model based on an appropriate experimental design that emphasizes error control has not been developed. Lattice design is very suitable for experiments with large sample sizes, which is usually required for accurate mapping of quantitative traits. However, the lack of a QTL mapping method based on lattice design meant that the arithmetic mean or adjusted mean of each line of observations in the lattice design had to be used as a response variable, resulting in low QTL detection power. As an improvement, we developed a QTL mapping method termed composite interval mapping based on lattice design (CIMLD). In the lattice design, experimental errors are decomposed into random errors and block-within-replication errors. Four levels of block-within-replication errors were simulated to show the power of QTL detection under different error controls. The simulation results showed that the arithmetic mean method, which is equivalent to a method under randomized complete block design (RCBD), was very sensitive to the size of the block variance: as block variance increased, the power of QTL detection decreased from 51.3% to 9.4%. In contrast to the RCBD method, the power of CIMLD and the adjusted mean method did not change for different block variances. The CIMLD method showed 1.2- to 7.6-fold higher power of QTL detection than the arithmetic or adjusted mean methods. Our proposed method was applied to real soybean (Glycine max) data as an example and 10 QTLs for biomass were identified that explained 65.87% of the phenotypic variation, while only three and two QTLs were identified by arithmetic and adjusted mean methods, respectively. PMID:26076140
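    As an illustration of the error-control idea (not the authors' CIMLD code), a toy simulation contrasting arithmetic line means with intra-block-adjusted means under a large block-within-replication variance:

```python
# Illustrative sketch: arithmetic line means absorb incomplete-block effects,
# while block-adjusted means largely remove them. All parameters hypothetical.
import random
from statistics import mean

random.seed(1)
LINES, REPS, BLOCK_SIZE, BLOCK_SD, NOISE_SD = 20, 2, 5, 5.0, 1.0
g = [random.gauss(0, 1) for _ in range(LINES)]      # true line (genotype) effects

def mse(est):  # mean squared error against the true line effects
    return mean((e - t) ** 2 for e, t in zip(est, g))

obs = [[0.0] * REPS for _ in range(LINES)]
adj = [[0.0] * REPS for _ in range(LINES)]
for r in range(REPS):
    order = list(range(LINES))
    random.shuffle(order)                            # fresh blocking each replication
    for start in range(0, LINES, BLOCK_SIZE):
        block = order[start:start + BLOCK_SIZE]
        beff = random.gauss(0, BLOCK_SD)             # block-within-replication error
        y = {i: g[i] + beff + random.gauss(0, NOISE_SD) for i in block}
        bmean = mean(y.values())
        for i in block:
            obs[i][r] = y[i]
            adj[i][r] = y[i] - bmean                 # crude intra-block adjustment

arith = [mean(row) for row in obs]
adjusted = [mean(row) for row in adj]
print(f"MSE arithmetic mean: {mse(arith):.2f}  MSE adjusted mean: {mse(adjusted):.2f}")
```

    With a large block variance the arithmetic means are badly contaminated, mirroring the power loss the authors report for the RCBD-equivalent method.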

  16. Hepatitis C Virus RNA Real-Time Quantitative RT-PCR Method Based on a New Primer Design Strategy.

    PubMed

    Chen, Lida; Li, Wenli; Zhang, Kuo; Zhang, Rui; Lu, Tian; Hao, Mingju; Jia, Tingting; Sun, Yu; Lin, Guigao; Wang, Lunan; Li, Jinming

    2016-01-01

    Viral nucleic acids are unstable when improperly collected, handled, and stored, resulting in decreased sensitivity of currently available commercial quantitative nucleic acid testing kits. Using known unstable hepatitis C virus RNA, we developed a quantitative RT-PCR method based on a new primer design strategy to reduce the impact of nucleic acid instability on nucleic acid testing. The performance of the method was evaluated for linearity, limit of detection, precision, specificity, and agreement with commercial hepatitis C virus assays. Its clinical application was compared to that of two commercial kits, Cobas AmpliPrep/Cobas TaqMan (CAP/CTM) and Kehua. The quantitative RT-PCR method delivered a good performance, with a linearity of R² = 0.99, a total limit of detection (genotypes 1 to 6) of 42.6 IU/mL (95% CI, 32.84 to 67.76 IU/mL), a CV of 1.06% to 3.34%, a specificity of 100%, and a high concordance with the CAP/CTM assay (R² = 0.97), with a mean ± SD of -0.06 ± 1.96 log IU/mL (range, -0.38 to 0.25 log IU/mL). The method was superior to commercial assays in detecting unstable hepatitis C virus RNA (P < 0.05). This quantitative RT-PCR method can effectively eliminate the influence of RNA instability on nucleic acid testing. The principle of the primer design strategy may be applied to the detection of other RNA or DNA viruses. PMID:26612712
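    The quantitation step underlying any real-time PCR assay of this kind is a standard-curve fit of Ct against log10 concentration; a generic textbook sketch (hypothetical calibrator values, not the authors' assay):

```python
# Generic qPCR standard-curve quantitation sketch: Ct is linear in
# log10(concentration), so unknowns are read off a least-squares fit.
def fit_standard_curve(log10_conc, ct):
    n = len(ct)
    mx, my = sum(log10_conc) / n, sum(ct) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_conc, ct))
    sxx = sum((x - mx) ** 2 for x in log10_conc)
    slope = sxy / sxx
    intercept = my - slope * mx
    efficiency = 10 ** (-1 / slope) - 1   # amplification efficiency from slope
    return slope, intercept, efficiency

def quantify(ct, slope, intercept):
    return 10 ** ((ct - intercept) / slope)   # concentration on the calibrator scale

# Ten-fold dilution series (log10 IU/mL vs Ct), illustrative values:
slope, intercept, eff = fit_standard_curve([2, 3, 4, 5], [33.3, 30.0, 26.7, 23.4])
print(f"slope={slope:.2f}, efficiency={eff:.0%}")
```

    A slope near -3.32 corresponds to roughly 100% amplification efficiency, the usual acceptance check before reporting viral loads.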

  17. Final report on the radiological surveys of designated DX firing sites at Los Alamos National Laboratory

    SciTech Connect

    1996-09-09

    CHEMRAD was contracted by Los Alamos National Laboratory to perform USRADS® (UltraSonic Ranging And Data System) radiation scanning surveys at designated DX sites at the Los Alamos National Laboratory. The primary purpose of these scanning surveys was to identify the presence of depleted uranium (D-38) resulting from activities at the DX firing sites. This effort was conducted to update the most recent surveys of these areas. The current effort was initiated with site orientation on August 12, 1996. Surveys were completed in the field on September 4, 1996. This Executive Summary briefly presents the major findings of this work. The detailed survey results are presented in the balance of this report and are organized by Technical Area and Site number in section 2; this organization is not in chronological order. USRADS and the related survey methods are described in section 3. Quality Control issues are addressed in section 4. Surveys were conducted with an array of radiation detectors either mounted on a backpack frame for man-carried use (Manual mode) or on a tricycle cart (RadCart mode). The array included radiation detectors for gamma and beta surface and near-surface contamination as well as dose rate at 1 meter above grade. The radiation detectors were interfaced directly to an USRADS 2100 Data Pack.

  18. Wide Field Infrared Survey Telescope [WFIRST]: Telescope Design and Simulated Performance

    NASA Technical Reports Server (NTRS)

    Goullioud, R.; Content, D. A.; Kuan, G. M.; Moore, J. D.; Chang, Z.; Sunada, E. T.; Villalvazo, J.; Hawk, J. P.; Armani, N. V.; Johnson, E. L.; Powell, C. A.

    2012-01-01

    The Astro2010 Decadal Survey proposed multiple missions with NIR focal planes and three-mirror wide-field telescopes in the 1.5 m aperture range. None of them would have won as standalone missions. WFIRST is a combination of these missions, created by the Astro2010 committee. The WFIRST Science Definition Team (SDT) was tasked to examine the design. The project team is a GSFC-JPL-Caltech collaboration. This interim mission design is the result of combined work by the project team and the SDT.

  19. Optical Design Trade Study for the Wide Field Infrared Survey Telescope (WFIRST)

    NASA Technical Reports Server (NTRS)

    Content, David A.; Goullioud, R.; Lehan, John P.; Mentzell, John E.

    2011-01-01

    The Wide Field Infrared Survey Telescope (WFIRST) mission concept was ranked first among new space astrophysics missions by the Astro2010 Decadal Survey, incorporating the Joint Dark Energy Mission (JDEM)-Omega payload concept and multiple science white papers. This mission is based on a space telescope at L2 studying exoplanets [via gravitational microlensing], probing dark energy, and surveying the near infrared sky. Since the release of NWNH, the WFIRST project has been working with the WFIRST science definition team (SDT) to refine mission and payload concepts. We present the driving requirements. The current interim reference mission point design, based on the use of a 1.3 m unobscured-aperture three-mirror anastigmat form with focal imaging and slitless spectroscopy science channels, is consistent with the requirements, requires no technology development, and outperforms the JDEM-Omega design.

  20. Epidemiological survey of anti-flea IgE in dogs in Japan by using an antigen-specific IgE quantitative measurement method

    PubMed Central

    Ichikawa, Y.; Beugnet, F.

    2012-01-01

    In Japan, an epidemiological survey was performed in dogs from October to December 2008 by using a quantitative measurement method for antigen-specific IgE towards specific Ctenocephalides felis antigens. 214 dogs from 22 veterinary clinics were included. These clinics were located as follows, from north to south: Hokkaido, Aomori, Fukushima, Tochigi, Saitama, Chiba, Tokyo (Tama-City and Ota-ku), Kanagawa, Gifu, Niigata, Kyoto, Nara, Osaka, Hyogo, Kagawa, Ehime, Hiroshima, Yamaguchi, Fukuoka, Kumamoto and Kagoshima. 110 dogs (51.4%) were seropositive for flea-specific IgE. No differences were associated with gender or breed. This survey confirms that flea infestation in dogs is a common problem in Japan. It especially shows that infestation also occurs in northern Japan, where fleas are considered uncommon by veterinarians. PMID:22550629

  1. CONDITION ASSESSMENT FOR THE ESCAMBIA RIVER, FL, WATERSHED: BENTHIC MACROINVERTEBRATE SURVEYS USING A PROBABILISTIC SAMPLING DESIGN

    EPA Science Inventory

    Probabilistic sampling has been used to assess the condition of estuarine ecosystems, and the use of this survey design approach was examined for a northwest Florida watershed. Twenty-eight lotic sites within the Escambia River, Florida, watershed were randomly selected and visit...

  2. ASSESSING THE ECOLOGICAL CONDITION OF A COASTAL PLAIN WATERSHED USING A PROBABILISTIC SURVEY DESIGN

    EPA Science Inventory

    Using a probabilistic survey design, we assessed the ecological condition of the Florida (USA) portion of the Escambia River watershed using selected environmental and benthic macroinvertebrate data. Macroinvertebrates were sampled at 28 sites during July-August 1996, and 3414 i...

  3. USING GIS TO GENERATE SPATIALLY-BALANCED RANDOM SURVEY DESIGNS FOR NATURAL RESOURCE APPLICATIONS

    EPA Science Inventory

    Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sam...

  4. RESEARCH VESSEL SURVEY DESIGN FOR MONITORING DOLPHIN ABUNDANCE IN THE EASTERN TROPICAL PACIFIC

    E-print Network

    RESEARCH VESSEL SURVEY DESIGN FOR MONITORING DOLPHIN ABUNDANCE IN THE EASTERN TROPICAL PACIFIC Service began conducting long-term research ship surveys to determine the status of spotted dolphin, Stenella attenuata, stocks in the eastern tropical Pacific. This is the main dolphin species taken incidentally

  5. Methodological Uses of TUS to Inform Design and Analysis of Tobacco Control Surveys

    Cancer.gov

    Methodological Uses of TUS to Inform Design and Analysis of Tobacco Control Surveys. Cristine Delnevo, PhD, MPH, UMDNJ-School of Public Health. Why is methods research in Tobacco Surveillance important? Measuring individual behavior over time is crucial

  6. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys

    PubMed Central

    Hund, Lauren; Bedrick, Edward J.; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis. PMID:26125967
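    For background, the classical (non-clustered) LQAS decision rule that the cluster designs above generalize can be sketched as follows; the n = 19, d = 13 rule and 50%/80% coverage thresholds are a common textbook example, not taken from this paper:

```python
# Classical LQAS: sample n units, "accept" the lot if successes >= d.
# Misclassification risks come from the binomial tails.
from math import comb

def binom_cdf(k, n, p):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_risks(n, d, p_low, p_high):
    """alpha: P(accept | true coverage p_low)  -- passing a bad lot
       beta:  P(reject | true coverage p_high) -- failing a good lot"""
    alpha = 1 - binom_cdf(d - 1, n, p_low)   # successes >= d at p_low
    beta = binom_cdf(d - 1, n, p_high)       # successes <  d at p_high
    return alpha, beta

# Textbook vaccination-coverage example: n=19, decision rule d=13,
# thresholds 50% (unacceptable) vs 80% (acceptable):
alpha, beta = lqas_risks(19, 13, 0.50, 0.80)
print(f"alpha={alpha:.3f}, beta={beta:.3f}")
```

    The cluster LQAS methods discussed in the paper replace this simple binomial model with one that accounts for within-cluster correlation at the design stage.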

  8. Estimation of wildlife population ratios incorporating survey design and visibility bias

    USGS Publications Warehouse

    Samuel, M.D.; Steinhorst, R.K.; Garton, E.O.; Unsworth, J.W.

    1992-01-01

    Age and sex ratio statistics are often a key component of the evaluation and management of wildlife populations. These statistics are determined from counts of animals that are commonly plagued by errors associated with either survey design or visibility bias. We present age and sex ratio estimators that incorporate both these sources of error and include the typical situation that animals are sampled in groups. Aerial surveys of elk (Cervus elaphus) in northcentral Idaho illustrate that differential visibility of age or sex classes can produce biased ratio estimates. Visibility models may be used to provide corrected estimates of ratios and their variability that incorporates errors due to sampling, visibility bias, and visibility estimation.
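    The core correction idea can be sketched as follows; the detection probabilities and group counts are hypothetical, and the authors' full estimator also propagates sampling and visibility-estimation error:

```python
# Sketch of a visibility-corrected ratio estimate: observed counts in each
# group are inflated by class-specific detection probabilities before the
# ratio is formed. All numbers illustrative.
def corrected_ratio(groups, p_detect):
    """groups: list of dicts of observed counts per class per group;
       p_detect: assumed detection probability for each class."""
    totals = {}
    for grp in groups:
        for cls, count in grp.items():
            totals[cls] = totals.get(cls, 0.0) + count / p_detect[cls]
    return totals["calves"] / totals["cows"]   # e.g. calves per cow

groups = [{"calves": 2, "cows": 7}, {"calves": 1, "cows": 5}, {"calves": 4, "cows": 10}]
naive = sum(g["calves"] for g in groups) / sum(g["cows"] for g in groups)
# Calves are typically harder to see from the air; assumed probabilities:
corrected = corrected_ratio(groups, {"calves": 0.6, "cows": 0.8})
print(f"naive ratio {naive:.3f} vs visibility-corrected {corrected:.3f}")
```

    Because the less visible class is inflated more, the corrected ratio exceeds the naive one, which is exactly the bias direction the elk surveys illustrate.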

  9. Distance software: design and analysis of distance sampling surveys for estimating population size

    PubMed Central

    Thomas, Len; Buckland, Stephen T; Rexstad, Eric A; Laake, Jeff L; Strindberg, Samantha; Hedley, Sharon L; Bishop, Jon RB; Marques, Tiago A; Burnham, Kenneth P

    2010-01-01

    1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance. 2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use. 3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated. 4. A first step in analysis of distance sampling data is modelling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance. 5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap. 6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modelling analysis engine for spatial and habitat modelling, and information about accessing the analysis engines directly from other software. 7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. In step with theoretical developments, state-of-the-art software that implements these methods is described that makes the methods accessible to practising ecologists. PMID:20383262
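    A minimal version of the conventional distance sampling estimate, assuming an untruncated half-normal detection function fitted by maximum likelihood (illustrative distances, not from the paper):

```python
# Conventional line-transect distance sampling sketch: fit a half-normal
# detection function to perpendicular distances, then D = n / (2 * L * esw).
import math

def half_normal_esw(distances):
    # MLE for the half-normal scale: sigma^2 = mean of squared distances;
    # effective strip half-width (esw) for an untruncated half-normal:
    sigma2 = sum(x * x for x in distances) / len(distances)
    return math.sqrt(math.pi * sigma2 / 2)

def density(distances, total_line_length_km):
    esw_km = half_normal_esw(distances) / 1000.0   # distances given in metres
    return len(distances) / (2 * total_line_length_km * esw_km)  # per km^2

# Hypothetical perpendicular detection distances (m) on 50 km of transect:
dists_m = [5, 12, 18, 25, 3, 40, 22, 9, 15, 30]
print(f"estimated density: {density(dists_m, 50.0):.2f} animals/km^2")
```

    The Distance software's conventional engine generalizes this sketch with truncation, alternative key functions, adjustment terms, and analytic or bootstrap variances.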

  10. Loop Shaping Control Design for a Supersonic Propulsion System Model Using Quantitative Feedback Theory (QFT) Specifications and Bounds

    NASA Technical Reports Server (NTRS)

    Connolly, Joseph W.; Kopasakis, George

    2010-01-01

    This paper covers the propulsion system component modeling and controls development of an integrated mixed compression inlet and turbojet engine that will be used for an overall vehicle Aero-Propulso-Servo-Elastic (APSE) model. Using previously created nonlinear component-level propulsion system models, a linear integrated propulsion system model and loop shaping control design have been developed. The design includes both inlet normal shock position control and jet engine rotor speed control for a potential supersonic commercial transport. A preliminary investigation of the impacts of the aero-elastic effects on the incoming flow field to the propulsion system is discussed; however, the focus here is on developing a methodology for the propulsion controls design that prevents unstart in the inlet and minimizes the thrust oscillation experienced by the vehicle. Quantitative Feedback Theory (QFT) specifications and bounds, and aspects of classical loop shaping, are used in the control design process. Model uncertainty is incorporated in the design to address possible error in the system identification mapping of the nonlinear component models into the integrated linear model.

  11. A quantitative methodology for mapping project costs to engineering decisions in naval ship design and procurement

    E-print Network

    Netemeyer, Kristopher David

    2010-01-01

    Alternative methods for cost estimation are important in the early conceptual stages of a design when there is not enough detail to allow for a traditional quantity takeoff estimate to be performed. Much of the budgeting ...

  12. Screen Design Guidelines for Motivation in Interactive Multimedia Instruction: A Survey and Framework for Designers.

    ERIC Educational Resources Information Center

    Lee, Sung Heum; Boling, Elizabeth

    1999-01-01

    Identifies guidelines from the literature relating to screen design and design of interactive instructional materials. Describes two types of guidelines--those aimed at enhancing motivation and those aimed at preventing loss of motivation--for typography, graphics, color, and animation and audio. Proposes a framework for considering motivation in…

  13. Nonexperimental Quantitative Research and Its Role in Guiding Instruction

    ERIC Educational Resources Information Center

    Cook, Bryan G.; Cook, Lysandra

    2008-01-01

    Different research designs answer different questions. Educators cannot use nonexperimental quantitative research designs, such as descriptive surveys and correlational research, to determine definitively that an intervention causes improved student outcomes and is an evidence-based practice. However, such research can (a) inform educators about a…

  14. A survey of quantitative real-time polymerase chain reaction internal reference genes for expression studies in Brassica napus.

    PubMed

    Chen, Xue; Truksa, Martin; Shah, Saleh; Weselake, Randall J

    2010-10-01

    Eight reference genes of Brassica napus were evaluated using quantitative real-time polymerase chain reaction (qRT-PCR) data, focusing on vegetative tissues and developing embryos. Analyses of expression stability indicated that UP1, UBC9, UBC21, and TIP41 were the top four choices as stably expressed reference genes for vegetative tissues, whereas ACT7, UBC21, TIP41, and PP2A were the top four choices for maturing embryos. In addition, radiolabeling of overall messenger RNA (mRNA) of maturing embryos indicated that the expression patterns of the top four ranked reference genes reflected the overall mRNA content changes in maturing embryos. PMID:20522329
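    One common way to rank expression stability, sketched here with hypothetical Ct values (the paper's exact ranking algorithm may differ), is the comparative delta-Ct approach: the gene whose pairwise Ct differences vary least across samples is most stable:

```python
# Comparative delta-Ct stability screen (generic sketch): for each pair of
# candidate reference genes, compute the sd of their Ct difference across
# samples; rank genes by their average pairwise sd (lower = more stable).
from statistics import stdev
from itertools import combinations

def stability_rank(ct):
    """ct: {gene: [Ct per sample]}; returns genes from most to least stable."""
    score = {g: [] for g in ct}
    for g1, g2 in combinations(ct, 2):
        s = stdev(a - b for a, b in zip(ct[g1], ct[g2]))
        score[g1].append(s)
        score[g2].append(s)
    return sorted(ct, key=lambda g: sum(score[g]) / len(score[g]))

# Hypothetical Ct values across four tissues for three candidate genes:
ranking = stability_rank({
    "UBC21": [22.1, 22.3, 22.2, 22.4],
    "TIP41": [25.0, 25.1, 24.9, 25.2],
    "ACT7":  [18.0, 19.5, 17.2, 20.1],
})
print("most to least stable:", ranking)
```

    The gene names echo candidates from the abstract, but the Ct values and the resulting ranking are invented for illustration only.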

  15. Creel survey sampling designs for estimating effort in short-duration Chinook salmon fisheries

    USGS Publications Warehouse

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2013-01-01

    Chinook Salmon Oncorhynchus tshawytscha sport fisheries in the Columbia River basin are commonly monitored using roving creel survey designs and require precise, unbiased catch estimates. The objective of this study was to examine the relative bias and precision of total catch estimates using various sampling designs to estimate angling effort under the assumption that mean catch rate was known. We obtained information on angling populations based on direct visual observations of portions of Chinook Salmon fisheries in three Idaho river systems over a 23-d period. Based on the angling population, Monte Carlo simulations were used to evaluate the properties of effort and catch estimates for each sampling design. All sampling designs evaluated were relatively unbiased. Systematic random sampling (SYS) resulted in the most precise estimates. The SYS and simple random sampling designs had mean square error (MSE) estimates that were generally half of those observed with cluster sampling designs. The SYS design was more efficient (i.e., higher accuracy per unit cost) than a two-cluster design. Increasing the number of clusters available for sampling within a day decreased the MSE of estimates of daily angling effort, but the MSE of total catch estimates was variable depending on the fishery. The results of our simulations provide guidelines on the relative influence of sample sizes and sampling designs on parameters of interest in short-duration Chinook Salmon fisheries.
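    A toy version of the Monte Carlo comparison, using a hypothetical diurnal effort pattern rather than the Idaho data, shows why systematic sampling of hourly counts tends to beat simple random sampling when effort follows a smooth daily cycle:

```python
# Illustrative Monte Carlo: estimate total daily angling effort from counts
# in a subset of hourly periods, comparing systematic vs simple random
# sampling. Population values are hypothetical, not from the study.
import random
from statistics import mean

random.seed(7)
# Angler counts for 16 daylight hourly periods with a midday peak:
effort = [2, 4, 7, 12, 18, 25, 30, 28, 26, 22, 17, 11, 8, 5, 3, 2]
TRUE_TOTAL, N, n = sum(effort), len(effort), 4

def expand(sample_values):                 # expansion estimator of the total
    return mean(sample_values) * N

def sys_sample():
    start = random.randrange(N // n)       # random start, then every 4th hour
    return [effort[start + k * (N // n)] for k in range(n)]

def srs_sample():
    return [effort[i] for i in random.sample(range(N), n)]

def mse(sampler, reps=5000):
    return mean((expand(sampler()) - TRUE_TOTAL) ** 2 for _ in range(reps))

print(f"true total {TRUE_TOTAL}; MSE systematic {mse(sys_sample):.0f} "
      f"vs simple random {mse(srs_sample):.0f}")
```

    Systematic samples automatically spread across the daily cycle, so their expanded totals vary far less than those from simple random samples, mirroring the paper's finding that SYS halved the MSE of cluster designs.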

  16. Design and Performance Considerations for the Quantitative Measurement of HEU Residues Resulting from 99Mo Production

    SciTech Connect

    McElroy, Robert Dennis; Chapman, Jeffrey Allen; Bogard, James S; Belian, Anthony P

    2011-01-01

    Molybdenum-99 is produced by the irradiation of high-enriched uranium (HEU) resulting in the accumulation of large quantities of HEU residues. In general, these residues are not recycled but are either disposed of or stored in containers with surface exposure rates as high as 100 R/h. The 235U content of these waste containers must be quantified for both accountability and waste disposal purposes. The challenges of quantifying such difficult-to-assay materials are discussed, along with performance estimates for each of several potential assay options. In particular, the design and performance of a High Activity Active Well Coincidence Counting (HA-AWCC) system designed and built specifically for these irradiated HEU waste materials are presented.

  17. Design of the stereoscopic eye-tracking system for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Sergeyev, Aleksandr; Levin, Eugene; Roggemann, Michael C.; Gienko, Gennady

    2008-08-01

    Spatial and temporal data derived from eye movements, compiled while the human eye observes geospatial imagery, retain meaningful and usable information. When a human perceives the stereo effect, a virtual three-dimensional (3D) model resulting from eye-brain interaction is generated in the mind. If the eye movements are recorded while the virtual model is observed, it is possible to reconstruct a 3D geometrical model almost identical to the one generated in the human brain. Information obtained from eye movements can be utilized in many ways for remote sensing applications such as geospatial image analysis and interpretation. There are various eye-tracking systems available on the market; however, none of them is designed to work with stereoscopic imagery. We explore different approaches and designs of the most suitable and non-intrusive scheme for stereoscopic image viewing in eye-tracking systems to observe and analyze 3D visual models. The design of the proposed system is based on the optical separation method, which provides a visually comfortable environment for perception of stereoscopic imagery. A proof-of-concept solution is based on a multiple mirror-lens assembly that provides a significant reduction of geometrical constraints in eye-frame capturing. Two projected solutions, one for wide-angle viewing and one for a helmet-integrated eye-tracker, are also discussed.

  18. Designing quantitative structure activity relationships to predict specific toxic endpoints for polybrominated diphenyl ethers in mammalian cells.

    PubMed

    Rawat, S; Bruce, E D

    2014-01-01

    Polybrominated diphenyl ethers (PBDEs) are known as effective flame retardants and have vast industrial application in products like plastics, building materials and textiles. They are structurally similar to thyroid hormones, which are responsible for regulating metabolism in the body. This structural similarity poses a threat to human health because, once in the system, PBDEs have the potential to affect thyroid hormone transport and metabolism. This study was aimed at designing quantitative structure-activity relationship (QSAR) models for predicting toxic endpoints, namely cell viability and apoptosis, elicited by PBDEs in mammalian cells. Cell viability was evaluated quantitatively using a general cytotoxicity bioassay with Janus Green dye, and apoptosis was evaluated using a caspase assay. This study has thus modelled the overall cytotoxic influence of PBDEs at an early and a late endpoint by the Genetic Function Approximation method. This research was a twofold process: running in vitro bioassays to collect data on the toxic endpoints, and modelling the evaluated endpoints using QSARs. Cell viability and apoptosis responses for Hep G2 cells exposed to PBDEs were successfully modelled with r² values of 0.97 and 0.94, respectively. PMID:24738916

  19. The GABRIEL Advanced Surveys: study design, participation and evaluation of bias.

    PubMed

    Genuneit, Jon; Büchele, Gisela; Waser, Marco; Kovacs, Katalin; Debinska, Anna; Boznanski, Andrzej; Strunz-Lehner, Christine; Horak, Elisabeth; Cullinan, Paul; Heederik, Dick; Braun-Fahrländer, Charlotte; von Mutius, Erika

    2011-09-01

Exposure to farming environments has been shown to protect substantially against asthma and atopic disease across Europe and in other parts of the world. The GABRIEL Advanced Surveys (GABRIELA) were conducted to determine which factors in farming environments are fundamental to protecting against asthma and atopic disease. The GABRIEL Advanced Surveys have a multi-phase stratified design. In a first screening phase, a comprehensive population-based survey was conducted to assess the prevalence of exposure to farming environments and of asthma and atopic diseases (n = 103,219). The second phase was designed to ascertain detailed exposure to farming environments and to collect biomaterial and environmental samples in a stratified random sample of phase 1 participants (n = 15,255). A third phase was carried out in a further stratified sample only in Bavaria, southern Germany, aiming at in-depth respiratory disease and exposure assessment including extensive environmental sampling (n = 895). Participation rates in phase 1 were around 60%, but only about half of the participating study population consented to further study modules in phase 2. We found that consenting behaviour was related to familial allergies, high parental education, wheeze, doctor-diagnosed asthma and rhinoconjunctivitis, and to a lesser extent to exposure to farming environments. The association of exposure to farm environments with asthma or rhinoconjunctivitis was not biased by participation or consenting behaviour. The GABRIEL Advanced Surveys are one of the largest studies to shed light on the protective 'farm effect' on asthma and atopic disease. Bias with regard to the main study question could be ruled out by representativeness and high participation rates in phases 2 and 3. The GABRIEL Advanced Surveys have created extensive collections of questionnaire data, biomaterial and environmental samples, promising new insights into this area of research. PMID:21819425

  20. Surveys

    Cancer.gov

Behavioral Risk Factor Surveillance System (BRFSS) The world's largest ongoing telephone health survey system, tracking health conditions and risk behaviors in the United States yearly since 1984. Currently, data are collected monthly in all 50 states, the District of Columbia, Puerto Rico, the U.S. Virgin Islands, and Guam.

  1. Design of eye models used in quantitative analysis of interaction between chromatic and higher-order aberrations of eye

    NASA Astrophysics Data System (ADS)

    Zhai, Yi; Wang, Yan; Wang, Zhaoqi; Liu, Yongji; Zhang, Lin; He, Yuanqing; Chang, Shengjiang

    2014-12-01

Special kinds of eye models are constructed by means of optical system design to quantitatively investigate the impacts of longitudinal chromatic aberration (LCA), transverse chromatic aberration (TCA) and LCA+TCA on retinal image quality and on depth of focus (DOF), as well as the interaction between chromatic and higher-order aberrations. Results show that LCA plays the main role in enhancing DOF, and higher-order aberrations increase DOF further. For most eyes the impact of higher-order aberrations on vision is much smaller than that of LCA+TCA, and the presence of LCA+TCA further reduces the impact of higher-order aberrations. The impact of LCA approximates that of LCA+TCA, while the impact of TCA approximates that of a normal level of higher-order aberrations and is negligible.

  2. The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    E-print Network

    Newman, Jeffrey A; Davis, Marc; Faber, S M; Coil, Alison L; Guhathakurta, Puragra; Koo, David C; Phillips, Andrew C; Conroy, Charlie; Dutton, Aaron A; Finkbeiner, Douglas P; Gerke, Brian F; Rosario, David J; Weiner, Benjamin J; Willmer, Christopher N A; Yan, Renbin; Harker, Justin J; Kassin, Susan A; Konidaris, Nicholas P; Lai, Kamson; Madgwick, Darren S; Noeske, Kai G; Wirth, Gregory D; Connolly, Andrew J; Kaiser, Nick; Kirby, Evan N; Lemaux, Brian C; Lin, Lihwai; Lotz, Jennifer M; Luppino, Gerard A; Marinoni, Christian; Matthews, Daniel J; Metevier, Anne; Schiavon, Ricardo P

    2012-01-01

We describe the design and data sample of the DEEP2 Galaxy Redshift Survey, the densest and largest precision-redshift survey of galaxies at z ~ 1 completed to date. The survey has conducted a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M_B = -20 at z ~ 1 via ~90 nights of observation on the DEIMOS spectrograph at Keck Observatory. DEEP2 covers an area of 2.8 deg^2 divided into four separate fields, observed to a limiting apparent magnitude of R_AB = 24.1. Objects with z < 0.7 are largely rejected by a photometric pre-selection, allowing galaxies at z > 0.7 to be targeted ~2.5 times more efficiently than in a purely magnitude-limited sample. Approximately sixty percent of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets which fail to yield secure redshifts are blue objects that lie beyond z ~ 1.45. The DEIMOS 1200-line/mm grating used for the survey delivers high spectral resolution (R ~ 6000), accurate and secu...

  3. Design and methods of the Adult Inuit Health Survey 2007–2008

    PubMed Central

    Saudny, Helga; Leggee, Donna; Egeland, Grace

    2012-01-01

    Background The Canadian International Polar Year (IPY) program made it possible to undertake much needed health research in 3 jurisdictions within the Canadian Inuit Nunangat (homeland) over a 2-year period: Inuvialuit Settlement Region (ISR), Nunavut Territory, and Nunatsiavut. Design The Adult Inuit Health Survey (IHS) was a cross-sectional survey and provides baseline data upon which future comparisons can be made for prospectively assessing factors leading to the progression of chronic diseases among Canadian Inuit. With the help of the Canadian Coast Guard Ship Amundsen, which was equipped with research and laboratory facilities, 33 coastal communities were visited; land survey teams visited 3 inland communities. Results The Adult IHS succeeded in obtaining important baseline information concerning the health status and living conditions of 2,595 adults living in ISR, Nunavut and Nunatsiavut. Conclusion Information from this survey will be useful for future comparisons and the opportunity to link with the International Inuit Cohort, a follow-up evaluation, and for the development of future health policies and public health interventions. PMID:23166895

  4. Improving the design of acoustic and midwater trawl surveys through stratification, with an application to Lake Michigan prey fishes

    USGS Publications Warehouse

    Adams, J.V.; Argyle, R.L.; Fleischer, G.W.; Curtis, G.L.; Stickel, R.G.

    2006-01-01

    Reliable estimates of fish biomass are vital to the management of aquatic ecosystems and their associated fisheries. Acoustic and midwater trawl surveys are an efficient sampling method for estimating fish biomass in large bodies of water. To improve the precision of biomass estimates from combined acoustic and midwater trawl surveys, sampling effort should be optimally allocated within each stage of the survey design. Based on information collected during fish surveys, we developed an approach to improve the design of combined acoustic and midwater trawl surveys through stratification. Geographic strata for acoustic surveying and depth strata for midwater trawling were defined using neighbor-restricted cluster analysis, and the optimal allocation of sampling effort for each was then determined. As an example, we applied this survey stratification approach to data from lakewide acoustic and midwater trawl surveys of Lake Michigan prey fishes. Precision of biomass estimates from surveys with and without geographic stratification was compared through resampling. Use of geographic stratification with optimal sampling allocation reduced the variance of Lake Michigan acoustic biomass estimates by 77%. Stratification and optimal allocation at each stage of an acoustic and midwater trawl survey should serve to reduce the variance of the resulting biomass estimates.
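The optimal-allocation step described above is commonly done with Neyman allocation, which assigns sampling effort to each stratum in proportion to stratum size times stratum standard deviation. A minimal sketch, with stratum sizes and pilot-survey standard deviations that are hypothetical rather than taken from the Lake Michigan data:

```python
def neyman_allocation(total_n, sizes, sds):
    """Allocate total_n sampling units across strata in proportion to
    stratum size * stratum standard deviation (Neyman allocation)."""
    weights = [N * s for N, s in zip(sizes, sds)]
    total = sum(weights)
    return [round(total_n * w / total) for w in weights]

# Hypothetical: three geographic strata, sizes in available transect units,
# SDs of acoustic backscatter from a pilot survey.
alloc = neyman_allocation(60, sizes=[100, 50, 30], sds=[2.0, 5.0, 1.0])
# Larger and more variable strata receive more of the 60 sampling units.
```

Note that smaller but highly variable strata (the second stratum here) can draw more effort than larger, more homogeneous ones, which is exactly the variance-reduction mechanism the abstract exploits.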

  5. "Intelligent design" of a 3D reflection survey for the SAFOD drill-hole site

    NASA Astrophysics Data System (ADS)

    Alvarez, G.; Hole, J. A.; Klemperer, S. L.; Biondi, B.; Imhof, M.

    2003-12-01

SAFOD seeks to better understand the earthquake process by drilling through the San Andreas fault (SAF) to sample an earthquake in situ. To capitalize fully on the opportunities presented by the 1D drill-hole into a complex fault zone we must characterize the surrounding 3D geology at a scale commensurate with the drilling observations, to provide the structural context to extrapolate 1D drilling results along the fault plane and into the surrounding 3D volume. Excellent active-2D and passive-3D seismic observations completed and underway lack the detailed 3D resolution required. Only an industry-quality 3D reflection survey can provide c. 25 m subsurface sample-spacing horizontally and vertically. A 3D reflection survey will provide subsurface structural and stratigraphic control at the 100-m level, mapping major geologic units, structural boundaries, and subsurface relationships between the many faults that make up the SAF fault system. A principal objective should be a reflection-image (horizon-slice through the 3D volume) of the near-vertical fault plane(s) to show variations in physical properties around the drill-hole. Without a 3D reflection image of the fault zone, we risk interpreting drilled anomalies as ubiquitous properties of the fault, or risk missing important anomalies altogether. Such a survey cannot be properly costed or technically designed without major planning. "Intelligent survey design" can minimize source and receiver effort without compromising data-quality at the fault target. Such optimization can in principle reduce the cost of a 3D seismic survey by a factor of two or three, utilizing the known surface logistic constraints, partially-known sub-surface velocity field, and the suite of scientific targets at SAFOD. Our methodology poses the selection of the survey parameters as an optimization process that allows the parameters to vary spatially in response to changes in the subsurface. 
The acquisition geometry is locally optimized for uniformity of subsurface illumination by a micro-genetic algorithm. We start by accurately establishing the correspondence between the subsurface area of the target reflector (in this case, the steeply-dipping SAF) and the part of the surface area whose sources and receivers contribute to its image using 3D ray-tracing. We then use dense acquisition parameters in that part of the survey area and use standard parameters in the rest of the survey area. This is the key idea that allows us to get optimum image quality with the least acquisition effort. The optimization also requires constraints from structural geologists and from the community who will interpret the results. The most critical parameters to our optimization process are the structural model of the target(s) (depth and geological dips) and the velocity model in the subsurface. We seek community input, and have formed a scientific advisory committee of academic and industry leaders, to help evaluate trade-offs for the community between cost, resolution and volume of the resultant data-set, and to ensure that an appropriate range of piggy-back experiments is developed to utilize the seismic sources available during the 3D experiment. The scientific output of our project will be a community-vetted design for a 3D reflection survey over SAFOD that is technically feasible, cost-effective, and most likely to yield the image and seismic parameter measurements that will best constrain the physical properties of the fault zone and their spatial variation.

  6. KUIPER BELT OBJECT OCCULTATIONS: EXPECTED RATES, FALSE POSITIVES, AND SURVEY DESIGN

    SciTech Connect

    Bickerton, S. J.; Welch, D. L.; Kavelaars, J. J. E-mail: welch@physics.mcmaster.ca

    2009-05-15

A novel method of generating artificial scintillation noise is developed and used to evaluate occultation rates and false positive rates for surveys probing the Kuiper Belt with the method of serendipitous stellar occultations. A thorough examination of survey design shows that (1) diffraction-dominated occultations are critically (Nyquist) sampled at a rate of 2 Fsu⁻¹ (two samples per Fresnel scale unit), corresponding to 40 s⁻¹ for objects at 40 AU, (2) occultation detection rates are maximized when targets are observed at solar opposition, (3) Main Belt asteroids will produce occultation light curves identical to those of Kuiper Belt Objects (KBOs) if target stars are observed at solar elongations of 116° ≲ ε ≲ 125° or 131° ≲ ε ≲ 141°, and (4) genuine KBO occultations are likely to be so rare that a detection threshold of ≳7-8σ should be adopted to ensure that viable candidate events can be disentangled from false positives.
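Finding (1) can be reproduced to order of magnitude from the Fresnel scale. A sketch under assumed values (550 nm light and ~25 km/s relative velocity at opposition; neither number is stated in the abstract):

```python
import math

AU = 1.496e11  # metres per astronomical unit

def fresnel_scale(distance_au, wavelength=5.5e-7):
    """Fresnel scale sqrt(lambda * d / 2) in metres, for an occulter at
    distance d and monochromatic light of the given wavelength."""
    return math.sqrt(wavelength * distance_au * AU / 2)

def critical_rate(distance_au, v_rel=25e3, wavelength=5.5e-7):
    """Sampling rate (Hz) giving 2 samples per Fresnel-scale crossing time."""
    return 2 * v_rel / fresnel_scale(distance_au, wavelength)

fsu = fresnel_scale(40)   # roughly 1.3 km at 40 AU for 550 nm light
rate = critical_rate(40)  # roughly 40 Hz, consistent with the abstract
```

The ~40 s⁻¹ figure quoted for 40 AU objects falls out directly from these two assumed inputs.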

  7. Object-oriented modeling and design for sloan digital sky survey retained data

    SciTech Connect

    Huang, C.H.; Munn, J.; Yanny, B.; Kent, S.; Petravick, D.; Pordes, R.; Szalay, A.; Brunner, R.

    1995-12-01

The SDSS project will produce tens of terabytes of data with complex relationships among them and with uncertain complexity in their usage. The survey is being conducted by an international collaboration of eight institutions scattered throughout the US and Japan, as well as numerous individuals at other sites. The data archive must provide adequate access to all collaborating partners during the five-year survey lifetime to support: development and testing of software algorithms; quality analysis on both the raw and processed data; selection of spectroscopic targets from the photometric catalogs; and scientific analysis. Additionally, the archive will serve as the basis for the public distribution of the final calibrated data on a timely basis. In this paper, we document how we applied Object-Oriented modeling and design to the development of the data archive. Finally, based on these experiences, we put Object-Orientation in proper perspective.

  8. Implementing the World Mental Health Survey Initiative in Portugal – rationale, design and fieldwork procedures

    PubMed Central

    2013-01-01

Background The World Mental Health Survey Initiative was designed to evaluate the prevalence, the correlates, the impact and the treatment patterns of mental disorders. This paper describes the rationale and the methodological details regarding the implementation of the survey in Portugal, a country that still lacks representative epidemiological data about psychiatric disorders. Methods The World Mental Health Survey is a cross-sectional study with a representative sample of the Portuguese population, aged 18 or older, based on official census information. The WMH-Composite International Diagnostic Interview, adapted to the Portuguese language by a group of bilingual experts, was used to evaluate mental health status, disorder severity, impairment, use of services and treatment. Interviews were administered face-to-face at respondents’ dwellings, which were selected from a nationally representative multi-stage clustered area probability sample of households. The survey was administered using computer-assisted personal interview methods by trained lay interviewers. Data quality was strictly controlled in order to ensure the reliability and validity of the collected information. Results A total of 3,849 people completed the main survey, with 2,060 completing the long interview, with a response rate of 57.3%. Data cleaning was conducted in collaboration with the WMHSI Data Analysis Coordination Centre at the Department of Health Care Policy, Harvard Medical School. Collected information will provide lifetime and 12-month mental disorder diagnoses, according to the International Classification of Diseases and to the Diagnostic and Statistical Manual of Mental Disorders. 
Conclusions The findings of this study could have a major influence on mental health care policy planning efforts over the next years, especially in a country that still has a significant level of unmet needs regarding mental health services organization, delivery of care and epidemiological research. PMID:23837605

  9. Sampling effects on the identification of roadkill hotspots: Implications for survey design.

    PubMed

    Santos, Sara M; Marques, J Tiago; Lourenço, André; Medinas, Denis; Barbosa, A Márcia; Beja, Pedro; Mira, António

    2015-10-01

    Although locating wildlife roadkill hotspots is essential to mitigate road impacts, the influence of study design on hotspot identification remains uncertain. We evaluated how sampling frequency affects the accuracy of hotspot identification, using a dataset of vertebrate roadkills (n = 4427) recorded over a year of daily surveys along 37 km of roads. "True" hotspots were identified using this baseline dataset, as the 500-m segments where the number of road-killed vertebrates exceeded the upper 95% confidence limit of the mean, assuming a Poisson distribution of road-kills per segment. "Estimated" hotspots were identified likewise, using datasets representing progressively lower sampling frequencies, which were produced by extracting data from the baseline dataset at appropriate time intervals (1-30 days). Overall, 24.3% of segments were "true" hotspots, concentrating 40.4% of roadkills. For different groups, "true" hotspots accounted from 6.8% (bats) to 29.7% (small birds) of road segments, concentrating from <40% (frogs and toads, snakes) to >60% (lizards, lagomorphs, carnivores) of roadkills. Spatial congruence between "true" and "estimated" hotspots declined rapidly with increasing time interval between surveys, due primarily to increasing false negatives (i.e., missing "true" hotspots). There were also false positives (i.e., wrong "estimated" hotspots), particularly at low sampling frequencies. Spatial accuracy decay with increasing time interval between surveys was higher for smaller-bodied (amphibians, reptiles, small birds, small mammals) than for larger-bodied species (birds of prey, hedgehogs, lagomorphs, carnivores). Results suggest that widely used surveys at weekly or longer intervals may produce poor estimates of roadkill hotspots, particularly for small-bodied species. Surveying daily or at two-day intervals may be required to achieve high accuracy in hotspot identification for multiple species. PMID:26232568
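The hotspot definition used above, flagging segments whose counts exceed the upper 95% limit of a Poisson distribution around the per-segment mean, can be sketched directly. The segment counts below are hypothetical:

```python
import math

def poisson_upper_limit(mu, conf=0.95):
    """Smallest count k at which the Poisson(mu) CDF reaches conf."""
    k, term = 0, math.exp(-mu)
    cdf = term
    while cdf < conf:
        k += 1
        term *= mu / k
        cdf += term
    return k

def find_hotspots(counts, conf=0.95):
    """Return indices of segments whose roadkill count exceeds the upper
    Poisson limit computed from the mean count per segment."""
    mu = sum(counts) / len(counts)
    limit = poisson_upper_limit(mu, conf)
    return [i for i, c in enumerate(counts) if c > limit]

# Hypothetical roadkill counts for ten 500-m road segments:
hotspots = find_hotspots([2, 1, 0, 9, 3, 2, 12, 1, 0, 2])
# Only the two clearly elevated segments exceed the Poisson threshold.
```

Subsampling the daily dataset at longer intervals and re-running this test is what generates the false negatives and false positives the study quantifies.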

  10. Campsite survey implications for managing designated campsites at Great Smoky Mountains National Park

    USGS Publications Warehouse

    Marion, J.L.; Leung, Y.-F.

    1998-01-01

    Backcountry campsites and shelters in Great Smoky Mountains National Park were surveyed in 1993 as part of a new impact monitoring program. A total of 395 campsites and shelters were located and assessed, including 309 legal campsites located at 84 designated campgrounds, 68 illegal campsites, and 18 shelters. Primary campsite management problems identified by the survey include: (1) campsite proliferation, (2) campsite expansion and excessive size, (3) excessive vegetation loss and soil exposure, (4) lack of visitor solitude at campsites, (5) excessive tree damage, and (6) illegal camping. A number of potential management options are recommended to address the identified campsite management problems. Many problems are linked to the ability of visitors to determine the location and number of individual campsites within each designated campground. A principal recommendation is that managers apply site-selection criteria to existing and potential new campsite locations to identify and designate campsites that will resist and constrain the areal extent of impacts and enhance visitor solitude. Educational solutions are also offered.

  11. Cigarette pack design and adolescent smoking susceptibility: a cross-sectional survey

    PubMed Central

    Ford, Allison; MacKintosh, Anne Marie; Moodie, Crawford; Richardson, Sol; Hastings, Gerard

    2013-01-01

Objectives To compare adolescents’ responses to three different styles of cigarette packaging: novelty (branded packs designed with a distinctive shape, opening style or bright colour), regular (branded pack with no special design features) and plain (brown pack with a standard shape and opening and all branding removed, aside from brand name). Design Cross-sectional in-home survey. Setting UK. Participants Random location quota sample of 1025 never smokers aged 11–16 years. Main outcome measures Susceptibility to smoking and composite measures of pack appraisal and pack receptivity derived from 11 survey items. Results Mean responses to the three pack types were negative for all survey items. However, ‘novelty’ packs were rated significantly less negatively than the ‘regular’ pack on most items, and the novelty and regular packs were rated less negatively than the ‘plain’ pack. For the novelty packs, logistic regressions, controlling for factors known to influence youth smoking, showed that susceptibility was associated with positive appraisal and also receptivity. For example, those receptive to the innovative Silk Cut Superslims pack were more than four times as likely to be susceptible to smoking than those not receptive to this pack (AOR=4.42, 95% CI 2.50 to 7.81, p<0.001). For the regular pack, an association was found between positive appraisal and susceptibility but not with receptivity and susceptibility. There was no association with pack appraisal or receptivity for the plain pack. Conclusions Pack structure (shape and opening style) and colour are independently associated, not just with appreciation of and receptivity to the pack, but also with susceptibility to smoke. In other words, those who think most highly of novelty cigarette packaging are also the ones who indicate that they are most likely to go on to smoke. Plain packaging, in contrast, was found to directly reduce the appeal of smoking to adolescents. PMID:24056481

  12. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data

    PubMed Central

    Lewis, Jesse S.; Gerber, Brian D.

    2014-01-01

Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; explicitly recognizing that given a species occupies an area the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km² of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10–120 cameras) and occasions (20–120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. 
For common species with low detection (i.e., bobcat and coyote) the most efficient sampling approach was to increase the number of occasions (survey days). However, for common species that are moderately detectable (i.e., cottontail rabbit and mule deer), occupancy could reliably be estimated with comparatively low numbers of cameras over a short sampling period. We provide general guidelines for reliably estimating occupancy across a range of terrestrial species (rare to common: ψ = 0.175–0.970, and low to moderate detectability: p = 0.003–0.200) using motion-activated cameras. Wildlife researchers/managers with limited knowledge of the relative abundance and likelihood of detection of a particular species can apply these guidelines regardless of location. We emphasize the importance of prior biological knowledge, defined objectives and detailed planning (e.g., simulating different study-design scenarios) for designing effective monitoring programs and research studies. PMID:25210658
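The site-versus-occasion trade-off the authors simulate can be illustrated with a much cruder sketch that uses a naive occupancy estimator (the fraction of sites with at least one detection) instead of a full occupancy model. The ψ and p values below are illustrative, not taken from the study:

```python
import random

def simulate_naive_occupancy(n_sites, n_occasions, psi, p, n_reps=500, seed=1):
    """Simulate detection/non-detection surveys and return the mean absolute
    error of the naive occupancy estimate relative to the true psi."""
    rng = random.Random(seed)
    err = 0.0
    for _ in range(n_reps):
        detected = 0
        for _ in range(n_sites):
            occupied = rng.random() < psi
            # A site registers only if occupied AND detected on >=1 occasion.
            if occupied and any(rng.random() < p for _ in range(n_occasions)):
                detected += 1
        err += abs(detected / n_sites - psi)
    return err / n_reps

# A common but poorly detectable species (psi = 0.8, p = 0.05) with 40 cameras:
few = simulate_naive_occupancy(40, 20, 0.8, 0.05)
many = simulate_naive_occupancy(40, 120, 0.8, 0.05)
# Adding occasions shrinks the error, mirroring the bobcat/coyote result.
```

The naive estimator is biased low whenever p is small, which is precisely why formal occupancy models (and enough occasions) matter for such species.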

  13. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data.

    PubMed

    Shannon, Graeme; Lewis, Jesse S; Gerber, Brian D

    2014-01-01

Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; explicitly recognizing that given a species occupies an area the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km² of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10-120 cameras) and occasions (20-120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. 
For common species with low detection (i.e., bobcat and coyote) the most efficient sampling approach was to increase the number of occasions (survey days). However, for common species that are moderately detectable (i.e., cottontail rabbit and mule deer), occupancy could reliably be estimated with comparatively low numbers of cameras over a short sampling period. We provide general guidelines for reliably estimating occupancy across a range of terrestrial species (rare to common: ψ = 0.175-0.970, and low to moderate detectability: p = 0.003-0.200) using motion-activated cameras. Wildlife researchers/managers with limited knowledge of the relative abundance and likelihood of detection of a particular species can apply these guidelines regardless of location. We emphasize the importance of prior biological knowledge, defined objectives and detailed planning (e.g., simulating different study-design scenarios) for designing effective monitoring programs and research studies. PMID:25210658

  14. Requirements and concept design for large earth survey telescope for SEOS

    NASA Technical Reports Server (NTRS)

    Mailhot, P.; Bisbee, J.

    1975-01-01

The efforts of a one-year program of Requirements Analysis and Conceptual Design for the Large Earth Survey Telescope for the Synchronous Earth Observatory Satellite are summarized. A 1.4-meter-aperture Cassegrain telescope with a 0.6-deg field of view is shown to satisfy the observational requirements for a wide range of earth resources and meteorological applications. The telescope provides imagery or thermal mapping in ten spectral bands at one time in a field-sharing grouping of linear detector arrays. Pushbroom scanning is accomplished by spacecraft slew.

  15. Designing, Testing, and Validating an Attitudinal Survey on an Environmental Topic: A Groundwater Pollution Survey Instrument for Secondary School Students

    ERIC Educational Resources Information Center

    Lacosta-Gabari, Idoya; Fernandez-Manzanal, Rosario; Sanchez-Gonzalez, Dolores

    2009-01-01

Research on the assessment of environmental attitudes has increased significantly in recent years, and the development of specific attitude scales for specific environmental problems has often been proposed. This paper describes the Groundwater Pollution Test (GPT), a 19-item survey instrument using a Likert-type scale. The survey has been used with…

  16. Developing Design Criteria for Research Facilities (A Report on a Brief Exploratory Study of Approaches to Establishing an Objective, Quantitative Data Base).

    ERIC Educational Resources Information Center

    Tennant, Wesley L.

A study concerned with establishing a data and criteria basis for evaluating and designing research and science facilities is overviewed, with suggestions for facility design. The survey method, organization, and results are all included, revealing statistical information for science facility planners. Problems and misconceptions of…

  17. HIV testing during the Canadian immigration medical examination: a national survey of designated medical practitioners.

    PubMed

    Tran, Jennifer M; Li, Alan; Owino, Maureen; English, Ken; Mascarenhas, Lyndon; Tan, Darrell H S

    2014-01-01

    HIV testing is mandatory for individuals wishing to immigrate to Canada. Since the Designated Medical Practitioners (DMPs) who perform these tests may have varying experience in HIV and time constraints in their clinical practices, there may be variability in the quality of pre- and posttest counseling provided. We surveyed DMPs regarding HIV testing, counseling, and immigration inadmissibility. A 16-item survey was mailed to all DMPs across Canada (N = 203). The survey inquired about DMP characteristics, knowledge of HIV, attitudes and practices regarding inadmissibility and counseling, and interest in continuing medical education. There were a total of 83 respondents (41%). Participants frequently rated their knowledge of HIV diagnostics, cultural competency, and HIV/AIDS service organizations as "fair" (40%, 43%, and 44%, respectively). About 25%, 46%, and 11% of the respondents agreed/strongly agreed with the statements "HIV infected individuals pose a danger to public health and safety," "HIV-positive immigrants cause excessive demand on the healthcare system," and "HIV seropositivity is a reasonable ground for denial into Canada," respectively. Language was cited as a barrier to counseling, which focused on transmission risks (46% discussed this as "always" or "often") more than coping and social support (37%). There was a high level of interest (47%) in continuing medical education in this area. There are areas for improvement regarding DMPs' knowledge, attitudes, and practices about HIV infection, counseling, and immigration criteria. Continuing medical education and support for DMPs to facilitate practice changes could benefit newcomers who test positive through the immigration process. PMID:25029636

  18. Statistical methods for efficient design of community surveys of response to noise: Random coefficients regression models

    NASA Technical Reports Server (NTRS)

    Tomberlin, T. J.

    1985-01-01

Research studies of residents' responses to noise consist of interviews with samples of individuals who are drawn from a number of different compact study areas. The statistical techniques developed here provide a basis for the sample design decisions such studies require. These techniques are suitable for a wide range of sample survey applications. A sample may consist of a random sample of residents selected from a sample of compact study areas, or in a more complex design, of a sample of residents selected from a sample of larger areas (e.g., cities). The techniques may be applied to estimates of the effects on annoyance of noise level, numbers of noise events, the time-of-day of the events, ambient noise levels, or other factors. Methods are provided for determining, in advance, how accurately these effects can be estimated for different sample sizes and study designs. Using a simple cost function, they also provide for optimum allocation of the sample across the stages of the design for estimating these effects. These techniques are developed via a regression model in which the regression coefficients are assumed to be random, with components of variance associated with the various stages of a multi-stage sample design.
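The optimum-allocation logic behind such two-stage designs can be sketched numerically. Assuming a between-area variance component s2_between, a within-area variance s2_within, a per-area cost and a per-interview cost (all values below are hypothetical, not from the paper), the classic result n* = sqrt(s2_within * cost_area / (s2_between * cost_interview)) gives the interviews per area that minimise the variance of the overall mean for a fixed budget:

```python
import math

def optimal_interviews_per_area(s2_between, s2_within, cost_area, cost_interview):
    """Optimum number of interviews per study area in a two-stage sample.

    Minimises var(mean) = s2_between/m + s2_within/(m*n) subject to a
    total cost of m * (cost_area + n * cost_interview).
    """
    return math.sqrt((s2_within * cost_area) / (s2_between * cost_interview))

def variance_of_mean(s2_between, s2_within, m_areas, n_per_area):
    """Variance of the overall mean estimate for m areas of n interviews each."""
    return s2_between / m_areas + s2_within / (m_areas * n_per_area)

# Hypothetical values: setting up a study area costs 25x one interview.
n_star = optimal_interviews_per_area(
    s2_between=0.5, s2_within=2.0, cost_area=50.0, cost_interview=2.0)
```

With these toy numbers the optimum is 10 interviews per area; more within-area variance or costlier area set-up pushes the optimum toward fewer areas with more interviews each.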

  19. THE DEEP2 GALAXY REDSHIFT SURVEY: DESIGN, OBSERVATIONS, DATA REDUCTION, AND REDSHIFTS

    SciTech Connect

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Harker, Justin J.; Lai, Kamson; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan Renbin; Kassin, Susan A.; Konidaris, N. P. E-mail: djm70@pitt.edu E-mail: mdavis@berkeley.edu E-mail: koo@ucolick.org E-mail: phillips@ucolick.org; and others

    2013-09-15

We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ~ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M_B = -20 at z ~ 1 via ~90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg^2 divided into four separate fields observed to a limiting apparent magnitude of R_AB = 24.1. Objects with z <~ 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted ~2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ~ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm^-1 grating used for the survey delivers high spectral resolution (R ~ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. 
Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. Redshift errors and catastrophic failure rates are assessed through more than 2000 objects with duplicate observations. Sky subtraction is essentially photon-limited even under bright OH sky lines; we describe the strategies that permitted this, based on high image stability, accurate wavelength solutions, and powerful B-spline modeling methods. We also investigate the impact of targets that appear to be single objects in ground-based targeting imaging but prove to be composite in Hubble Space Telescope data; they constitute several percent of targets at z ~ 1, approaching ~5%-10% at z > 1.5. Summary data are given that demonstrate the superiority of DEEP2 over other deep high-precision redshift surveys at z ~ 1 in terms of redshift accuracy, sample number density, and amount of spectral information. We also provide an overview of the scientific highlights of the DEEP2 survey thus far.

  20. The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    NASA Astrophysics Data System (ADS)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Coil, Alison L.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan, Renbin; Harker, Justin J.; Kassin, Susan A.; Konidaris, N. P.; Lai, Kamson; Madgwick, Darren S.; Noeske, K. G.; Wirth, Gregory D.; Connolly, A. J.; Kaiser, N.; Kirby, Evan N.; Lemaux, Brian C.; Lin, Lihwai; Lotz, Jennifer M.; Luppino, G. A.; Marinoni, C.; Matthews, Daniel J.; Metevier, Anne; Schiavon, Ricardo P.

    2013-09-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ~ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude MB = -20 at z ~ 1 via ~90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg2 divided into four separate fields observed to a limiting apparent magnitude of R AB = 24.1. Objects with z <~ 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted ~2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ~ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm-1 grating used for the survey delivers high spectral resolution (R ~ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. 
Redshift errors and catastrophic failure rates are assessed through more than 2000 objects with duplicate observations. Sky subtraction is essentially photon-limited even under bright OH sky lines; we describe the strategies that permitted this, based on high image stability, accurate wavelength solutions, and powerful B-spline modeling methods. We also investigate the impact of targets that appear to be single objects in ground-based targeting imaging but prove to be composite in Hubble Space Telescope data; they constitute several percent of targets at z ~ 1, approaching ~5%-10% at z > 1.5. Summary data are given that demonstrate the superiority of DEEP2 over other deep high-precision redshift surveys at z ~ 1 in terms of redshift accuracy, sample number density, and amount of spectral information. We also provide an overview of the scientific highlights of the DEEP2 survey thus far. Based on observations taken at the W. M. Keck Observatory, which is operated jointly by the University of California and the California Institute of Technology, and on observations made with the NASA/ESO Hubble Space Telescope, obtained from the data archives at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555, and from the Canadian Astronomy Data Centre.

  1. The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    NASA Technical Reports Server (NTRS)

Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Coil, Alison L.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan, Renbin; Harker, Justin J.; Kassin, Susan A.; Konidaris, N. P.; Lai, Kamson; Madgwick, Darren S.; Noeske, K. G.; Wirth, Gregory D.; Kirby, Evan N.; Lotz, Jennifer M.

    2013-01-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z approx. 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude MB = -20 at z approx. 1 via approx.90 nights of observation on the Keck telescope. The survey covers an area of 2.8 Sq. deg divided into four separate fields observed to a limiting apparent magnitude of R(sub AB) = 24.1. Objects with z approx. < 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted approx. 2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z approx. 1.45, where the [O ii] 3727 Ang. doublet lies in the infrared. The DEIMOS 1200 line mm(exp -1) grating used for the survey delivers high spectral resolution (R approx. 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. 
Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. Redshift errors and catastrophic failure rates are assessed through more than 2000 objects with duplicate observations. Sky subtraction is essentially photon-limited even under bright OH sky lines; we describe the strategies that permitted this, based on high image stability, accurate wavelength solutions, and powerful B-spline modeling methods. We also investigate the impact of targets that appear to be single objects in ground-based targeting imaging but prove to be composite in Hubble Space Telescope data; they constitute several percent of targets at z approx. 1, approaching approx. 5%-10% at z > 1.5. Summary data are given that demonstrate the superiority of DEEP2 over other deep high-precision redshift surveys at z approx. 1 in terms of redshift accuracy, sample number density, and amount of spectral information. We also provide an overview of the scientific highlights of the DEEP2 survey thus far.

  2. A Survey to Examine Teachers' Perceptions of Design Dispositions, Lesson Design Practices, and Their Relationships with Technological Pedagogical Content Knowledge (TPACK)

    ERIC Educational Resources Information Center

    Koh, Joyce Hwee Ling; Chai, Ching Sing; Hong, Huang-Yao; Tsai, Chin-Chung

    2015-01-01

    This study investigates 201 Singaporean teachers' perceptions of their technological pedagogical content knowledge (TPACK), lesson design practices, and design dispositions through a survey instrument. Investigation of these constructs reveal important variables influencing teachers' perceptions of TPACK which have not yet been explored. The…

  3. Dealing with trade-offs in destructive sampling designs for occupancy surveys.

    PubMed

    Canessa, Stefano; Heard, Geoffrey W; Robertson, Peter; Sluiter, Ian R K

    2015-01-01

Occupancy surveys should be designed to minimise false absences. This is commonly achieved by increasing replication or increasing the efficiency of surveys. In the case of destructive sampling designs, in which searches of individual microhabitats represent the repeat surveys, minimising false absences leads to an inherent trade-off. Surveyors can sample more low quality microhabitats, bearing the resultant financial costs and producing wider-spread impacts, or they can target high quality microhabitats where the focal species is more likely to be found and risk more severe impacts on local habitat quality. We show how this trade-off can be solved with a decision-theoretic approach, using the Millewa Skink Hemiergis millewae from southern Australia as a case study. Hemiergis millewae is an endangered reptile that is best detected using destructive sampling of grass hummocks. Within sites that were known to be occupied by H. millewae, logistic regression modelling revealed that lizards were more frequently detected in large hummocks. If this model is an accurate representation of the detection process, searching large hummocks is more efficient and requires less replication, but this strategy also entails destruction of the best microhabitats for the species. We developed an optimisation tool to calculate the minimum combination of the number and size of hummocks to search to achieve a given cumulative probability of detecting the species at a site, incorporating weights to reflect the sensitivity of the results to a surveyor's priorities. The optimisation showed that placing high weight on minimising volume necessitates impractical replication, whereas placing high weight on minimising replication requires searching very large hummocks which are less common and may be vital for H. millewae. While destructive sampling methods are sometimes necessary, surveyors must be conscious of the ecological impacts of these methods. 
This study provides a simple tool for identifying sampling strategies that minimise those impacts. PMID:25760868
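The core of such a replication calculation is straightforward: under independent searches, cumulative detection probability is 1 - (1 - p)^n, with p predicted from microhabitat size by the logistic model. A minimal sketch (the logistic coefficients below are hypothetical; the paper's fitted values are not given in the abstract):

```python
import math

def detection_prob(size, b0=-2.0, b1=0.05):
    """Per-hummock detection probability from a logistic model of size.

    b0 and b1 are illustrative placeholder coefficients.
    """
    z = b0 + b1 * size
    return 1.0 / (1.0 + math.exp(-z))

def hummocks_needed(size, target=0.95):
    """Minimum replicates of a given hummock size so that the cumulative
    detection probability 1 - (1 - p)^n reaches the target, assuming
    independent searches."""
    p = detection_prob(size)
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))
```

With these toy coefficients, a 40-unit hummock (p = 0.5) needs 5 searches to reach 95% cumulative detection, while an 80-unit hummock needs only 2, which is exactly the replication-versus-impact trade-off the optimisation weighs.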

  4. Quantitative Evaluation of Tissue Surface Adaption of CAD-Designed and 3D Printed Wax Pattern of Maxillary Complete Denture

    PubMed Central

    Chen, Hu; Wang, Han; Lv, Peijun; Wang, Yong; Sun, Yuchun

    2015-01-01

Objective. To quantitatively evaluate the tissue surface adaption of a maxillary complete denture wax pattern produced by CAD and 3DP. Methods. A standard edentulous maxilla plaster cast model was used, for which a wax pattern of a complete denture was designed using CAD software developed in our previous study and printed using a 3D wax printer, while another wax pattern was manufactured by the traditional manual method. The cast model and the two wax patterns were scanned in the 3D scanner as “DataModel,” “DataWaxRP,” and “DataWaxManual.” After setting each wax pattern on the plaster cast, the whole model was scanned for registration. After registration, the deviations of tissue surface between “DataModel” and “DataWaxRP” and between “DataModel” and “DataWaxManual” were measured. The data was analyzed by paired t-test. Results. For both wax patterns, produced by the CAD&RP method and the manual method, scanning data of the tissue surface and cast surface showed a good fit over the majority of the surface. No statistically significant (P > 0.05) difference was observed between the CAD&RP method and the manual method. Conclusions. Wax patterns of maxillary complete dentures produced by the CAD&3DP method are comparable with the traditional manual method in their adaption to the edentulous cast model. PMID:26583108

  5. Survey of alternative gas turbine engine and cycle design. Final report

    SciTech Connect

    Lukas, H.

    1986-02-01

In the period of the 1940's to 1960's much experimentation was performed in the areas of intercooling, reheat, and recuperation, as well as the use of low-grade fuels in gas turbines. The Electric Power Research Institute (EPRI), in an effort to document past experience which can be used as the basis for current design activities, commissioned a study to document alternate cycles and components used in gas turbine design. The study was performed by obtaining the important technical and operational criteria of the cycles through a literature search of published documents, articles, and papers. Where possible the information was augmented through dialogue with persons associated with those cycles and with the manufacturers. The survey indicated that many different variations of the simple open-cycle gas turbine plant were used. Many of these changes resulted in increases in efficiency over the low simple-cycle efficiency of that period. Metallurgy, as well as compressor and turbine design, limited the simple-cycle efficiency to the upper teens. The cycle modifications increased those efficiencies to the twenties and thirties. Advances in metallurgy as well as compressor and turbine design, coupled with the decrease in fuel cost, stopped the development of these complex cycles. Many of the plants operated successfully for many years, and only because newer simple-cycle gas turbine plants and large steam plants had better heat rates were these units shut down or put into standby service. 24 refs., 25 figs., 114 tabs.

  6. Utility FGD survey, January--December 1989. Volume 2, Design performance data for operating FGD systems: Part 2

    SciTech Connect

    Hance, S.L.; McKibben, R.S.; Jones, F.M.

    1992-03-01

This is Volume 2, Part 2, of the Utility Flue Gas Desulfurization (FGD) Survey report. The report, generated by a computerized database management system, represents a survey of operational and planned domestic utility FGD systems. It summarizes information contributed by the utility industry, system and equipment suppliers, system designers, research organizations, and regulatory agencies. The data cover system design, fuel characteristics, operating history, and actual system performance. Also included is a unit-by-unit discussion of problems and solutions associated with the boilers, scrubbers, and FGD systems. This volume in particular contains basic design and performance data.

  7. High-Resolution Linkage and Quantitative Trait Locus Mapping Aided by Genome Survey Sequencing: Building Up An Integrative Genomic Framework for a Bivalve Mollusc

    PubMed Central

    Jiao, Wenqian; Fu, Xiaoteng; Dou, Jinzhuang; Li, Hengde; Su, Hailin; Mao, Junxia; Yu, Qian; Zhang, Lingling; Hu, Xiaoli; Huang, Xiaoting; Wang, Yangfan; Wang, Shi; Bao, Zhenmin

    2014-01-01

    Genetic linkage maps are indispensable tools in genetic and genomic studies. Recent development of genotyping-by-sequencing (GBS) methods holds great promise for constructing high-resolution linkage maps in organisms lacking extensive genomic resources. In the present study, linkage mapping was conducted for a bivalve mollusc (Chlamys farreri) using a newly developed GBS method—2b-restriction site-associated DNA (2b-RAD). Genome survey sequencing was performed to generate a preliminary reference genome that was utilized to facilitate linkage and quantitative trait locus (QTL) mapping in C. farreri. A high-resolution linkage map was constructed with a marker density (3806) that has, to our knowledge, never been achieved in any other molluscs. The linkage map covered nearly the whole genome (99.5%) with a resolution of 0.41 cM. QTL mapping and association analysis congruously revealed two growth-related QTLs and one potential sex-determination region. An important candidate QTL gene named PROP1, which functions in the regulation of growth hormone production in vertebrates, was identified from the growth-related QTL region detected on the linkage group LG3. We demonstrate that this linkage map can serve as an important platform for improving genome assembly and unifying multiple genomic resources. Our study, therefore, exemplifies how to build up an integrative genomic framework in a non-model organism. PMID:24107803
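As a note on the units involved: linkage-map distances such as the 0.41 cM resolution quoted above are conventionally related to recombination fractions through a mapping function. A sketch using Haldane's function, a standard choice (the abstract does not state which function the authors used):

```python
import math

def haldane_cM(r):
    """Haldane mapping function: recombination fraction r -> map distance in cM."""
    return -50.0 * math.log(1.0 - 2.0 * r)

def inverse_haldane(d_cm):
    """Map distance in cM -> expected recombination fraction."""
    return 0.5 * (1.0 - math.exp(-d_cm / 50.0))

# A 0.41 cM marker interval corresponds to a recombination fraction of ~0.004,
# i.e. roughly 4 recombinants per 1000 meioses between adjacent markers.
r = inverse_haldane(0.41)
```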

  8. Quantitative Assessment of a Senge Learning Organization Intervention

    ERIC Educational Resources Information Center

    Kiedrowski, P. Jay

    2006-01-01

    Purpose: To quantitatively assess a Senge learning organization (LO) intervention to determine if it would result in improved employee satisfaction. Design/methodology/approach: A Senge LO intervention in Division 123 of Company ABC was undertaken in 2000. Three employee surveys using likert-scale questions over five years and correlation analysis…

  9. EVALUATION OF VISUAL SURVEY PROGRAMS FOR MONITORING COHO SALMON ESCAPEMENT IN

    E-print Network

ABSTRACT: Canada's Wild Salmon Policy (WSP) requires that quantitative survey designs be used

  10. Technology transfer with system analysis, design, decision making, and impact (Survey-2000) in acute care hospitals in the United States.

    PubMed

    Hatcher, M

    2001-10-01

This paper provides the results of the Survey-2000 measuring technology transfer for management information systems in health care. The relationships with systems approaches, user involvement, user satisfaction, and decision-making were measured and are presented. The survey also measured the levels of Internet and intranet presence in acute care hospitals, which will be discussed in future articles. The depth of the survey includes e-commerce for both business-to-business transactions and customers. These results are compared, where appropriate, with results from the 1997 survey, and changes are discussed. This information will provide benchmarks for hospitals to plan their network technology position and to set goals. This is the first of three articles based upon the results of the Survey-2000. Readers are referred to a prior article by the author that discusses the survey design and provides a tutorial on technology transfer in acute care hospitals. PMID:11508906

  11. Changes in depth occupied by Great Lakes lake whitefish populations and the influence of survey design

    USGS Publications Warehouse

Rennie, Michael D.; Weidel, Brian C.; Claramunt, Randy; Dunlop, Erin S.

    2015-01-01

    Understanding fish habitat use is important in determining conditions that ultimately affect fish energetics, growth and reproduction. Great Lakes lake whitefish (Coregonus clupeaformis) have demonstrated dramatic changes in growth and life history traits since the appearance of dreissenid mussels in the Great Lakes, but the role of habitat occupancy in driving these changes is poorly understood. To better understand temporal changes in lake whitefish depth of capture (Dw), we compiled a database of fishery-independent surveys representing multiple populations across all five Laurentian Great Lakes. By demonstrating the importance of survey design in estimating Dw, we describe a novel method for detecting survey-based bias in Dw and removing potentially biased data. Using unbiased Dw estimates, we show clear differences in the pattern and timing of changes in lake whitefish Dw between our reference sites (Lake Superior) and those that have experienced significant benthic food web changes (lakes Michigan, Huron, Erie and Ontario). Lake whitefish Dw in Lake Superior tended to gradually shift to shallower waters, but changed rapidly in other locations coincident with dreissenid establishment and declines in Diporeia densities. Almost all lake whitefish populations that were exposed to dreissenids demonstrated deeper Dw following benthic food web change, though a subset of these populations subsequently shifted to more shallow depths. In some cases in lakes Huron and Ontario, shifts towards more shallow Dw are occurring well after documented Diporeia collapse, suggesting the role of other drivers such as habitat availability or reliance on alternative prey sources.

  12. Optical design concept of the 4-m visible and infrared survey telescope for astronomy

    NASA Astrophysics Data System (ADS)

    Atad-Ettedgui, Eli; Worswick, Susan P.

    2003-02-01

This paper describes the optical design of VISTA (Visible and Infrared Survey Telescope for Astronomy) in the infrared and visible configurations. The design is based on a fast quasi-Ritchey-Chretien telescope with an f/1 primary and an f/3 secondary. The large fields of view available, 1.65 degrees in the IR and 2.1 degrees in the visible, make use of the latest technology in optical materials, active optics, and large detector arrays. The residual third-order spherical aberration for on-axis images introduced in the two-mirror design is used to compensate for the residual spherical aberration in the field corrector lenses. The infrared camera is included in the telescope optimization, letting the radii of curvature and conic constants of the two mirrors vary in order to obtain the best performance across the entire IR detector array. The camera also contains an innovative cold baffle with a special black coating. The visible camera contains an ADC incorporated in the field corrector lenses. The acquisition, guiding, and wavefront sensing of this telescope are integrated in the instruments.

  13. Design and analysis of classifier learning experiments in bioinformatics: survey and case studies.

    PubMed

    Irsoy, Ozan; Yildiz, Olcay Taner; Alpaydin, Ethem

    2012-01-01

In many bioinformatics applications, it is important to assess and compare the performances of algorithms trained from data, in order to draw conclusions that are unaffected by chance and are therefore significant. Both the design of such experiments and the analysis of the resulting data using statistical tests should be done carefully for the results to carry significance. In this paper, we first review the performance measures used in classification, the basics of experiment design, and statistical tests. We then give the results of our survey over 1,500 papers published in the last two years in three bioinformatics journals (including this one). Although the basics of experiment design are well understood, such as resampling instead of using a single training set and the use of different performance metrics instead of error, only 21 percent of the papers use any statistical test for comparison. In the third part, we analyze four different scenarios which we encounter frequently in the bioinformatics literature, discussing the proper statistical methodology as well as showing an example case study for each. With the supplementary software, we hope that the guidelines we discuss will play an important role in future studies. PMID:22908127
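The comparison step the survey finds missing in most papers can be sketched with a simple distribution-free test: an exact paired sign-flip permutation test over per-fold accuracies, which is tractable for the small fold counts typical of cross-validation (the fold accuracies below are hypothetical):

```python
from itertools import product

def paired_permutation_test(acc_a, acc_b):
    """Exact paired sign-flip permutation test on per-fold accuracies.

    Returns the two-sided p-value for the null hypothesis that the two
    classifiers perform equally well on average across folds.
    """
    diffs = [a - b for a, b in zip(acc_a, acc_b)]
    observed = abs(sum(diffs))
    count = 0
    total = 0
    # Enumerate all 2^k assignments of signs to the paired differences.
    for signs in product((1, -1), repeat=len(diffs)):
        total += 1
        if abs(sum(s * d for s, d in zip(signs, diffs))) >= observed:
            count += 1
    return count / total

# Hypothetical per-fold accuracies from 10-fold cross-validation.
p = paired_permutation_test(
    [0.91, 0.89, 0.92, 0.90, 0.93, 0.88, 0.91, 0.92, 0.90, 0.89],
    [0.88, 0.87, 0.90, 0.89, 0.90, 0.86, 0.89, 0.90, 0.88, 0.87],
)
```

Because one classifier is better on every fold here, only the two extreme sign assignments match the observed statistic, giving p = 2/1024. Note that cross-validation folds are not fully independent, so this sketch, like the t-tests it replaces, is an approximation.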

  14. SURVEY DESIGN FOR SPECTRAL ENERGY DISTRIBUTION FITTING: A FISHER MATRIX APPROACH

    SciTech Connect

    Acquaviva, Viviana; Gawiser, Eric; Bickerton, Steven J.; Grogin, Norman A.; Guo Yicheng; Lee, Seong-Kook

    2012-04-10

The spectral energy distribution (SED) of a galaxy contains information on the galaxy's physical properties, and multi-wavelength observations are needed in order to measure these properties via SED fitting. In planning these surveys, optimization of the resources is essential. The Fisher Matrix (FM) formalism can be used to quickly determine the best possible experimental setup to achieve the desired constraints on the SED-fitting parameters. However, because it relies on the assumption of a Gaussian likelihood function, it is in general less accurate than other slower techniques that reconstruct the probability distribution function (PDF) from the direct comparison between models and data. We compare the uncertainties on SED-fitting parameters predicted by the FM to the ones obtained using the more thorough PDF-fitting techniques. We use both simulated spectra and real data, and consider a large variety of target galaxies differing in redshift, mass, age, star formation history, dust content, and wavelength coverage. We find that the uncertainties reported by the two methods agree within a factor of two in the vast majority (~90%) of cases. If the age determination is uncertain, the top-hat prior in age used in PDF fitting to prevent each galaxy from being older than the universe needs to be incorporated in the FM, at least approximately, before the two methods can be properly compared. We conclude that the FM is a useful tool for astronomical survey design.
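A minimal numerical sketch of an FM forecast, assuming independent Gaussian flux errors and a toy two-parameter power-law "SED" (the model, wavelengths, and error bars below are illustrative placeholders, not the paper's setup): the Fisher matrix is F_ij = sum_k (dm/dθ_i)(dm/dθ_j)/σ_k², and the marginalised 1-sigma parameter errors are the square roots of the diagonal of F^-1.

```python
import math

def fisher_matrix(model, theta, wavelengths, sigmas, eps=1e-6):
    """Fisher matrix F_ij = sum_k dm/dtheta_i * dm/dtheta_j / sigma_k^2
    for a model observed at given wavelengths with Gaussian errors."""
    n = len(theta)
    def deriv(i, lam):
        # Central finite difference in parameter i.
        up = list(theta); up[i] += eps
        dn = list(theta); dn[i] -= eps
        return (model(up, lam) - model(dn, lam)) / (2 * eps)
    return [[sum(deriv(i, lam) * deriv(j, lam) / s ** 2
                 for lam, s in zip(wavelengths, sigmas))
             for j in range(n)] for i in range(n)]

def forecast_errors_2x2(F):
    """Marginalised 1-sigma errors: sqrt of the diagonal of the 2x2 inverse."""
    det = F[0][0] * F[1][1] - F[0][1] * F[1][0]
    return (math.sqrt(F[1][1] / det), math.sqrt(F[0][0] / det))

# Toy power-law model with amplitude A and slope beta (illustrative only).
model = lambda th, lam: th[0] * lam ** (-th[1])
sig_A, sig_beta = forecast_errors_2x2(
    fisher_matrix(model, [1.0, 1.5], [0.5, 1.0, 2.0, 4.0], [0.05] * 4))
```

Doubling the flux errors doubles the forecast parameter errors, which is the kind of quick what-if that makes the FM attractive for survey design.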

  15. A survey of pulse shape options for a revised plastic ablator ignition design

    SciTech Connect

    Clark, D. S.; Milovich, J. L.; Hinkel, D. E.; Salmonson, J. D.; Peterson, J. L.; Berzak Hopkins, L. F.; Eder, D. C.; Haan, S. W.; Jones, O. S.; Marinak, M. M.; Robey, H. F.; Smalyuk, V. A.; Weber, C. R.

    2014-11-15

    Recent experimental results using the “high foot” pulse shape for inertial confinement fusion ignition experiments on the National Ignition Facility (NIF) [Moses et al., Phys. Plasmas 16, 041006 (2009)] have shown encouraging progress compared to earlier “low foot” experiments. These results strongly suggest that controlling ablation front instability growth can significantly improve implosion performance even in the presence of persistent, large, low-mode distortions. Simultaneously, hydrodynamic growth radiography experiments have confirmed that ablation front instability growth is being modeled fairly well in NIF experiments. It is timely then to combine these two results and ask how current ignition pulse shapes could be modified to improve one-dimensional implosion performance while maintaining the stability properties demonstrated with the high foot. This paper presents such a survey of pulse shapes intermediate between the low and high foot extremes in search of an intermediate foot optimum. Of the design space surveyed, it is found that a higher picket version of the low foot pulse shape shows the most promise for improved compression without loss of stability.

  16. Design of optical frequency comb for sky survey astronomical spectrograph calibration

    NASA Astrophysics Data System (ADS)

    Hu, Yao; Wang, Xiang

    2013-12-01

Sky survey telescopes are an important approach to ground-based observation of external galaxies, and to further research on the large-scale structure of the universe and on galaxy formation and evolution. A sky survey spectrograph (SSS) with low resolution is included in such telescope systems. The spectral measurement accuracy of the SSS determines the accuracy and scientific value of the resulting mass of spectral data. Currently an iodine absorption cell or a Thorium-Argon lamp is adopted as the calibration source for the SSS. However, their spectral lines are sparse, with non-uniform spectral interval and intensity, and can even be unstable over long periods. The novel astro-comb cannot be applied to the SSS directly because its spectral intervals are too dense for a spectrograph of relatively low resolution. In this paper, a spectral mode filtering method with acceptable energy reduction and accurate spectral line frequency is studied to adapt the current astro-comb to properly distributed spectral lines and solve this critical problem. Aiming at calibration for measuring spectral lines in the 3700-5900 Å region, we design an improved astro-comb system based on an Erbium-doped fiber laser and a Fabry-Perot filter series. Feasible system parameters are given. This work will help develop a novel calibration approach that reduces systematic error to less than 1/10000 of that of current calibration methods.

  17. A survey of pulse shape options for a revised plastic ablator ignition design

    NASA Astrophysics Data System (ADS)

    Clark, Daniel; Eder, David; Haan, Steven; Hinkel, Denise; Jones, Ogden; Marinak, Michael; Milovich, Jose; Peterson, Jayson; Robey, Harold; Salmonson, Jay; Smalyuk, Vladimir; Weber, Christopher

    2014-10-01

    Recent experimental results using the ``high foot'' pulse shape on the National Ignition Facility (NIF) have shown encouraging progress compared to earlier ``low foot'' experiments. These results strongly suggest that controlling ablation front instability growth can dramatically improve implosion performance, even in the presence of persistent, large, low-mode distortions. In parallel, Hydro. Growth Radiography experiments have so far validated the techniques used for modeling ablation front growth in NIF experiments. It is timely then to combine these two results and ask how current ignition pulse shapes could be modified so as to improve implosion performance, namely fuel compressibility, while maintaining the stability properties demonstrated with the high foot. This talk presents a survey of pulse shapes intermediate between the low and high foot extremes in search of a more optimal design. From the database of pulse shapes surveyed, a higher picket version of the original low foot pulse shape shows the most promise for improved compression without loss of stability. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  18. Developing an efficient modelling and data presentation strategy for ATDEM system comparison and survey design

    NASA Astrophysics Data System (ADS)

    Combrinck, Magdel

    2015-10-01

    Forward modelling of airborne time-domain electromagnetic (ATDEM) responses is frequently used to compare systems and design surveys for optimum detection of expected mineral exploration targets. Displaying and analysing the forward-modelled responses is challenging because of the large amount of data generated for three-dimensional models and the system-dependent nature of the data. I propose simplifying the display of ATDEM responses by using the dimensionless quantity of signal-to-noise ratios (signal:noise) instead of respective system units. I also introduce the concept of a three-dimensional signal:noise nomo-volume as an efficient tool for visually presenting and analysing large amounts of data. The signal:noise nomo-volume is a logical extension of the two-dimensional conductance nomogram. It contains the signal:noise values of all system time channels and components for various target depths and conductances, integrated into a single interactive three-dimensional image. Responses are calculated over a complete survey grid and therefore include the effects of system and target geometries. The user can interactively select signal:noise cut-off values on the nomo-volume and perform visual comparisons between various system and target responses. The process is easy to apply, and geophysicists with access to forward-modelling airborne electromagnetic (AEM) and three-dimensional imaging software already possess the tools required to produce and analyse signal:noise nomo-volumes.
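
The assembly of a signal:noise nomo-volume can be sketched as a 3-D array over target depth, target conductance and time channel, from which cut-off isosurfaces are rendered. The forward model below is a placeholder decay law for illustration only, not a real ATDEM response; the axes and noise floor are assumptions.

```python
import numpy as np

depths = np.linspace(50, 400, 8)          # target depth, m
conductances = np.linspace(5, 100, 10)    # target conductance, S
channels = np.arange(1, 21)               # time-channel index
noise_floor = 0.5                         # system noise level, arbitrary units

D, C, T = np.meshgrid(depths, conductances, channels, indexing="ij")
# Placeholder response: decays with depth, grows with conductance, decays at late times.
signal = 1e4 * C / (D ** 2) * np.exp(-T / 10.0)
nomo_volume = signal / noise_floor        # shape (8, 10, 20)

# Boolean mask for an interactive signal:noise cut-off of 1.
detectable = nomo_volume > 1.0
```

Replacing the placeholder with a genuine layered-earth or plate-model response, computed per system waveform, yields the system-comparison volume described in the abstract.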

  19. Designing Anti-Influenza Aptamers: Novel Quantitative Structure Activity Relationship Approach Gives Insights into Aptamer – Virus Interaction

    PubMed Central

    Musafia, Boaz; Oren-Banaroya, Rony; Noiman, Silvia

    2014-01-01

    This study describes the development of aptamers as a therapy against influenza virus infection. Aptamers are oligonucleotides (such as ssDNA or RNA) that are capable of binding to a variety of molecular targets with high affinity and specificity. We have studied the ssDNA aptamer BV02, which was designed to inhibit influenza infection by targeting the hemagglutinin viral protein, a protein that facilitates the first stage of the virus' infection. While testing other aptamers and during lead optimization, we realized that the dominant characteristics that determine the aptamer's binding to the influenza virus may not necessarily be sequence-specific, as with other known aptamers, but rather depend on general 2D structural motifs. We adopted a QSAR (quantitative structure-activity relationship) tool and developed a computational algorithm that correlates six calculated structural and physicochemical properties with the aptamers' binding affinity to the virus. The QSAR study provided us with a predictive tool for the binding potential of an aptamer to the influenza virus. The correlation between the calculated and actual binding was R² = 0.702 for the training set and R² = 0.66 for the independent test set. Moreover, in the test set the model's sensitivity was 89% and its specificity was 87% in selecting aptamers with enhanced viral binding. The properties that correlated most positively with the aptamer's binding were the aptamer length, 2D loops, and repeating sequences of C nucleotides. Based on the structure-activity study, we produced aptamers with viral affinity more than 20 times higher than that of the original BV02 aptamer. Further testing in cell culture and animal models of influenza infection yielded aptamers with 10 to 15 times greater antiviral activity than the BV02 aptamer. Our insights concerning the mechanism of action and the structural and physicochemical properties that govern the interaction with the influenza virus are discussed. PMID:24846127
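
The kind of linear QSAR model described above, regressing binding affinity on a handful of calculated descriptors and reporting R² on a held-out set, can be sketched as follows. The descriptor matrix and weights are synthetic stand-ins for properties such as length, 2D-loop count and C-repeats, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_aptamers, n_descriptors = 40, 6
X = rng.normal(size=(n_aptamers, n_descriptors))         # descriptor matrix
true_w = np.array([1.5, 0.8, 0.6, 0.0, -0.4, 0.2])       # hypothetical weights
y = X @ true_w + rng.normal(scale=0.3, size=n_aptamers)  # "measured" binding

# Fit on a training split, evaluate R^2 on a held-out test split.
X_tr, X_te, y_tr, y_te = X[:30], X[30:], y[:30], y[30:]
w, *_ = np.linalg.lstsq(np.c_[X_tr, np.ones(30)], y_tr, rcond=None)
pred = np.c_[X_te, np.ones(10)] @ w
r2 = 1.0 - np.sum((y_te - pred) ** 2) / np.sum((y_te - y_te.mean()) ** 2)
```

A classifier threshold on the predicted binding would then give the sensitivity/specificity figures quoted in the abstract; that step is omitted here.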

  20. Designing HIGH-COST medicine: hospital surveys, health planning, and the paradox of progressive reform.

    PubMed

    Perkins, Barbara Bridgman

    2010-02-01

    Inspired by social medicine, some progressive US health reforms have paradoxically reinforced a business model of high-cost medical delivery that does not match social needs. In analyzing the financial status of their areas' hospitals, for example, city-wide hospital surveys of the 1910s through 1930s sought to direct capital investments and, in so doing, control competition and markets. The 2 national health planning programs that ran from the mid-1960s to the mid-1980s continued similar strategies of economic organization and management, as did the so-called market reforms that followed. Consequently, these reforms promoted large, extremely specialized, capital-intensive institutions and systems at the expense of less complex (and less costly) primary and chronic care. The current capital crisis may expose the lack of sustainability of such a model and open up new ideas and new ways to build health care designed to meet people's health needs. PMID:20019312

  2. Hot rocket plume experiment - Survey and conceptual design. [of rhenium-iridium bipropellants

    NASA Technical Reports Server (NTRS)

    Millard, Jerry M.; Luan, Taylor W.; Dowdy, Mack W.

    1992-01-01

    Attention is given to a space-borne engine plume experiment study whose aim is to fly an experiment that will both verify and quantify the reduced contamination from advanced rhenium-iridium earth-storable bipropellant rockets ("hot rockets") and provide a correlation between high-fidelity in-space measurements and theoretical plume and surface contamination models. The experiment conceptual design is based on survey results from plume and contamination technologists throughout the U.S. With respect to shuttle use, cursory investigations validate Hitchhiker availability and adaptability, adequate remote manipulator system (RMS) articulation and dynamic capability, acceptable RMS attachment capability, adequate power and telemetry capability, and adequate flight altitude and attitude/orbital capability.

  3. Quantitative radiography

    SciTech Connect

    Logan, C.M.; Hernandez, J.M.; Devine, G.J.

    1991-02-01

    We have developed a system of quantitative radiography in order to produce quantitative images displaying the homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to interpret film radiographs quantitatively, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudocolor images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. Images are captured using DuPont NDT55 industrial x-ray film in Daypack (trademark) packages. The x-ray cabinets are of custom design, with a helium flight path and a filter wheel for positioning filters if desired. The cabinets contain baffles to reduce scattered radiation and are equipped with a drawer for rapid loading and unloading of parts. Separate units with tungsten-anode or copper-anode tubes are available. The usual operating voltage is 15 to 35 kVp. Fixturing provides for rough part positioning and precise alignment with respect to the x-ray source. Areal density standards are placed at several locations on each film. In interpreting the image, we use the standards nearest the image of the part being quantified; because of this, small variations in x-ray flux uniformity (heel effects) are unimportant. The usual standard is a step wedge of aluminum containing 13 steps. Films are permanently labeled by imaging a perforated metal numbering strip. Data such as part number and step wedge identification are read from barcode labels and transferred to a database for later retrieval and use in quantifying the image.
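
The calibration step described above, converting digitized film grey levels to areal density via a step-wedge standard imaged on the same film, can be sketched as piecewise-linear interpolation. The wedge grey levels and densities below are invented for illustration, not the system's actual calibration data.

```python
# Mean grey level of each step of a hypothetical 13-step aluminium wedge,
# paired with its known areal density (g/cm^2), in monotonic order.
wedge_grey = [20, 35, 52, 70, 90, 112, 135, 158, 180, 200, 218, 233, 245]
wedge_density = [0.05 * (i + 1) for i in range(13)]   # 0.05 .. 0.65

def grey_to_density(grey):
    """Piecewise-linear interpolation of density from the bracketing wedge steps."""
    if grey <= wedge_grey[0]:
        return wedge_density[0]
    if grey >= wedge_grey[-1]:
        return wedge_density[-1]
    for i in range(len(wedge_grey) - 1):
        g0, g1 = wedge_grey[i], wedge_grey[i + 1]
        if g0 <= grey <= g1:
            frac = (grey - g0) / (g1 - g0)
            return wedge_density[i] + frac * (wedge_density[i + 1] - wedge_density[i])

density = grey_to_density(101)   # grey level of a pixel in the part's image
```

Using the wedge nearest the part, as the abstract notes, makes the mapping insensitive to slow flux variations (heel effects) across the film.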

  4. Measuring Coverage in MNCH: Design, Implementation, and Interpretation Challenges Associated with Tracking Vaccination Coverage Using Household Surveys

    PubMed Central

    Cutts, Felicity T.; Izurieta, Hector S.; Rhoda, Dale A.

    2013-01-01

    Vaccination coverage is an important public health indicator that is measured using administrative reports and/or surveys. The measurement of vaccination coverage in low- and middle-income countries using surveys is susceptible to numerous challenges. These challenges include selection bias and information bias, which cannot be solved by increasing the sample size, and the precision of the coverage estimate, which is determined by the survey sample size and sampling method. Selection bias can result from an inaccurate sampling frame or inappropriate field procedures and, since populations likely to be missed in a vaccination coverage survey are also likely to be missed by vaccination teams, most often inflates coverage estimates. Importantly, the large multi-purpose household surveys that are often used to measure vaccination coverage have invested substantial effort to reduce selection bias. Information bias occurs when a child's vaccination status is misclassified due to mistakes on his or her vaccination record, in data transcription, in the way survey questions are presented, or in the guardian's recall of vaccination for children without a written record. There has been substantial reliance on the guardian's recall in recent surveys, and, worryingly, information bias may become more likely in the future as immunization schedules become more complex and variable. Finally, some surveys assess immunity directly using serological assays. Sero-surveys are important for assessing public health risk, but currently are unable to validate coverage estimates directly. To improve vaccination coverage estimates based on surveys, we recommend that recording tools and practices should be improved and that surveys should incorporate best practices for design, implementation, and analysis. PMID:23667334
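
The precision point above, that the coverage estimate's precision is set by sample size and sampling method, has a standard back-of-envelope form: under cluster sampling, the design effect (DEFF) inflates the simple-random-sampling variance. The numbers below are illustrative assumptions.

```python
import math

def coverage_ci_halfwidth(p, n, deff=2.0, z=1.96):
    """Approximate 95% CI half-width for estimated coverage p from n children,
    with cluster-sampling variance inflated by the design effect deff."""
    return z * math.sqrt(deff * p * (1.0 - p) / n)

# E.g. 80% estimated coverage from 300 children with DEFF = 2:
half = coverage_ci_halfwidth(p=0.80, n=300, deff=2.0)   # ~ +/- 6.4 points
```

Note this addresses only sampling error; the selection and information biases discussed above are unaffected by larger n.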

  5. Mechanical Design of NESSI: New Mexico Tech Extrasolar Spectroscopic Survey Instrument

    NASA Technical Reports Server (NTRS)

    Santoro, Fernando G.; Olivares, Andres M.; Salcido, Christopher D.; Jimenez, Stephen R.; Jurgenson, Colby A.; Hrynevych, Michael A.; Creech-Eakman, Michelle J.; Boston, Penny J.; Schmidt, Luke M.; Bloemhard, Heather; Rodeheffer, Dan; Vaive, Genevieve; Vasisht, Gautam; Swain, Mark R.; Deroo, Pieter

    2011-01-01

    NESSI: the New Mexico Tech Extrasolar Spectroscopic Survey Instrument is a ground-based multi-object spectrograph that operates in the near-infrared. It will be installed on one of the Nasmyth ports of the Magdalena Ridge Observatory (MRO) 2.4-meter Telescope sited in the Magdalena Mountains, about 48 km west of Socorro-NM. NESSI operates stationary to the telescope fork so as not to produce differential flexure between internal opto-mechanical components during or between observations. An appropriate mechanical design allows the instrument alignment to be highly repeatable and stable for both short and long observation timescales, within a wide-range of temperature variation. NESSI is optically composed of a field lens, a field de-rotator, re-imaging optics, an auto-guider and a Dewar spectrograph that operates at LN2 temperature. In this paper we report on NESSI's detailed mechanical and opto-mechanical design, and the planning for mechanical construction, assembly, integration and verification.

  6. Design of Gravity Survey Network using Fractal Theory to Delineate Hydrocarbon bearing Jabera Structure, Vindhyan Basin, Central India

    NASA Astrophysics Data System (ADS)

    Dimri, V. P.; Srivastava, R. P.; Vedanti, N.

    2006-12-01

    A gravity survey network was designed using fractal dimension analysis to delineate a domal structure (the Jabera dome) reported in the southeastern part of the Vindhyan basin, Central India. This area has also been regarded as a "high risk-high reward" frontier area for hydrocarbon exploration in previous studies; hence our aim was to delineate the shape and lateral extent of the reported domal structure. Based on a synthetic grid designed using the concept of fractal dimension, gravity data were collected in the Jabera-Damoh area of the Vindhyan basin. The collected data are randomly distributed, but the data density is significant, so the data points were sorted to lie close to the synthetic grid points at the given grid interval. After sorting, fractal dimension analysis using the box-counting method was carried out again, both to avoid aliasing introduced by interpolation and to determine the optimum number of data points for the desired quality of Bouguer anomaly maps. Optimizing the number of stations controls the time and cost of the survey, and the detectability limit ensures that the data are good enough to resolve the target body; the fractal dimension analysis gives clues for selecting these parameters. It showed that it is always preferable to have well-distributed station locations rather than clustering observation points at some geologically known feature, because clustering data points below the required station spacing adds little information, whereas equally distributed observation points do add information. The study area lies in difficult terrain in the Vindhyan basin, so, according to accessibility, fractal dimension analysis of the real data sorted at approximately regular grid intervals of 2, 3, and 4 km was carried out, and Bouguer anomaly maps of the region were prepared using the concept of optimum gridding interval.
    Preliminary depth values of the major interfaces in the area were obtained from 2D scaling spectral analysis of the data. The results reveal three main depth interfaces, at 5 km, 1.5 km, and 0.8 km, corresponding respectively to the basement/lower Vindhyan interface, the lower/upper Vindhyan interface, and the terrain clearance. For quantitative interpretation, we selected a profile across the target structure (reported as the Jabera dome), and the gravity data acquired along it were modeled using the Marquardt inversion approach. This profile was selected to estimate a tentative geological cross-section across the conspicuous gravity low observed in the southern part of the study area. Deep Seismic Sounding (DSS) studies by earlier workers indicated the presence of thick sediments in this part of the Vindhyan basin. The gravity anomaly along this profile shows the typical pattern of a sedimentary basin faulted on both margins. The modeling results show that the anomaly corresponds to a deep faulted basin in the crystalline basement, in which the upper layer, with a density of 2.46 g/cc, corresponds to the upper Vindhyan rocks. This layer is underlain by a thick layer (1.0 to 6.5 km) of lower Vindhyan sediments, which slopes gently from NW to SE and sits over high-density rocks of the Bijawar/Mahakoshal group.
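
The box-counting analysis used above to judge whether stations are well distributed can be sketched directly: count the boxes of side s occupied by at least one station at several scales and fit the slope of log N(s) against log(1/s). The station coordinates here are synthetic points in a unit square, not the survey's data.

```python
import math
import random

random.seed(1)
stations = [(random.random(), random.random()) for _ in range(500)]

def box_count(points, s):
    """Number of boxes of side s that contain at least one station."""
    return len({(int(x / s), int(y / s)) for x, y in points})

sizes = [0.5, 0.25, 0.125, 0.0625]
pts = [(math.log(1.0 / s), math.log(box_count(stations, s))) for s in sizes]

# Least-squares slope of log N vs log(1/s) ~ fractal dimension D.
# D close to 2 indicates the stations cover the plane evenly; clustering
# below the required spacing pulls D down.
n = len(pts)
mu = sum(u for u, _ in pts) / n
mv = sum(v for _, v in pts) / n
D = sum((u - mu) * (v - mv) for u, v in pts) / sum((u - mu) ** 2 for u, _ in pts)
```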

  7. Optimal design of a lagrangian observing system for hydrodynamic surveys in coastal areas

    NASA Astrophysics Data System (ADS)

    Cucco, Andrea; Quattrocchi, Giovanni; Antognarelli, Fabio; Satta, Andrea; Maicu, Francesco; Ferrarin, Christian; Umgiesser, Georg

    2014-05-01

    The optimization of ocean observing systems is a pressing need for scientific research. In particular, the improvement of short-term ocean observing networks is achievable by reducing the cost-benefit ratio of field campaigns and by increasing the quality of measurements. Numerical modeling is a powerful tool for determining the appropriateness of a specific observing system and for optimizing the sampling design. This is particularly true when observations are carried out in coastal areas and lagoons, where the use of satellites is prohibitive owing to the shallowness of the water. For such areas, numerical models are the most efficient tool both to provide a preliminary assessment of the local physical environment and to make short-term predictions of its changes. In this context, a test-case experiment was carried out in an enclosed shallow-water area, the Cabras Lagoon (Sardinia, Italy). The aim of the experiment was to explore the optimal design of a field survey based on the use of coastal lagrangian buoys. A three-dimensional hydrodynamic model based on the finite element method (SHYFEM3D; Umgiesser et al., 2004) was implemented to simulate the lagoon water circulation. The model domain extends over the whole Cabras Lagoon and the whole Gulf of Oristano, including the surrounding coastal area. Lateral open boundary conditions were provided by the operational ocean model system WMED, and only wind forcing, provided by the SKIRON atmospheric model (Kallos et al., 1997), was considered as the surface boundary condition. The model was applied to provide a number of ad hoc scenarios and to explore the efficiency of the short-term hydrodynamic survey. A first field campaign was carried out to investigate the lagrangian circulation inside the lagoon under the main wind forcing condition (Mistral wind from the north-west).
    The trajectories followed by the lagrangian buoys and the estimated lagrangian velocities were used to calibrate the model parameters and to validate the simulation results. A set of calibration runs was performed and the model accuracy in reproducing the surface circulation was assessed. A numerical simulation was then conducted to predict the wind-induced lagoon water circulation and the paths followed by numerical particles inside the lagoon domain. The simulated particle paths were analyzed and the optimal configuration for the buoy deployment was designed in real time. The selected deployment geometry was then tested during a further field campaign. The resulting dataset revealed that the chosen measurement strategy provided a near-synoptic survey with the longest records for the specific observing experiment considered. This work aims to emphasize the mutual usefulness of observations and numerical simulations in coastal ocean applications, and it proposes an efficient approach to harmonizing different expertise in the investigation of a given research issue. Cucco, A., Sinerchia, M., Ribotti, A., Olita, A., Fazioli, L., Perilli, A., Sorgente, B., Borghini, M., Schroeder, K., Sorgente, R., 2012. A high-resolution real-time forecasting system for predicting the fate of oil spills in the Strait of Bonifacio (western Mediterranean Sea). Marine Pollution Bulletin 64(6), 1186-1200. Kallos, G., Nickovic, S., Papadopoulos, A., Jovic, D., Kakaliagou, O., Misirlis, N., Boukas, L., Mimikou, N., G., S., J., P., Anadranistakis, E., Manousakis, M., 1997. The regional weather forecasting system Skiron: An overview. In: Proceedings of the Symposium on Regional Weather Prediction on Parallel Computer Environments, 109-122, Athens, Greece. Umgiesser, G., Melaku Canu, D., Cucco, A., Solidoro, C., 2004. A finite element model for the Venice Lagoon. Development, set up, calibration and validation. Journal of Marine Systems 51, 123-145.
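
The numerical-particle step above can be sketched as forward-Euler advection of a lagrangian buoy in a prescribed surface velocity field. A real application would interpolate velocities from the SHYFEM model grid; here the field is an analytic solid-body rotation around the basin centre, and all values are illustrative.

```python
import math

CX, CY = 500.0, 500.0      # basin centre, metres (assumed toy geometry)
OMEGA = 1.0e-4             # rotation rate, rad/s

def velocity(x, y):
    """Toy rotational surface current (m/s)."""
    return (-OMEGA * (y - CY), OMEGA * (x - CX))

def advect(x, y, dt, steps):
    """Advance one buoy with forward-Euler steps of dt seconds."""
    for _ in range(steps):
        u, v = velocity(x, y)
        x, y = x + u * dt, y + v * dt
    return x, y

# One buoy released 200 m east of the centre, tracked for 100 minutes.
x1, y1 = advect(700.0, 500.0, dt=60.0, steps=100)
angle = math.atan2(y1 - CY, x1 - CX)     # ~ OMEGA * 6000 s = 0.6 rad of rotation
radius = math.hypot(x1 - CX, y1 - CY)    # ~ 200 m; forward Euler inflates it slightly
```

Running a cloud of such particles from candidate release points is what lets the deployment geometry be chosen in real time before the buoys go in the water.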

  8. CONDITION ASSESSMENT FOR THE ESCAMBIA RIVER, FL, WATERSHED: BENTHIC MACROINVERTEBRATE SURVEYS USING A PROBABILISTIC SAMPLING DESIGN (POSTER SESSION)

    EPA Science Inventory

    Probabilistic sampling has been used to assess the condition of estuarine ecosystems, and the use of this survey design approach was examined for a northwest Florida watershed. Twenty-eight lotic sites within the Escambia River, Florida, watershed were randomly selected and visit...

  9. Technology Survey and Performance Scaling for the Design of High Power Nuclear Electric Power and Propulsion Systems

    E-print Network

    Technology Survey and Performance Scaling for the Design of High Power Nuclear Electric Power and Propulsion Systems, by Daniel B. White Jr. Submitted for the degree of Doctor of Philosophy in Aeronautics and Astronautics. ABSTRACT: High power nuclear electric...

  10. Design, Data Collection, Interview Timing, and Data Editing in the 1995 National Household Education Survey (NHES:95). Working Paper Series.

    ERIC Educational Resources Information Center

    Collins, Mary A.; Brick, J. Michael; Loomis, Laura S.; Nicchitta, Patricia G.; Fleischman, Susan

    The National Household Education Survey (NHES) is a data collection effort of the National Center for Education Statistics that collects and publishes data on the condition of education in the United States. The NHES is designed to provide information on issues that are best addressed by contacting households rather than institutions. It is a…

  11. Final Design of the CARMENES M-Dwarf Radial-Velocity Survey Instrument

    NASA Astrophysics Data System (ADS)

    Quirrenbach, Andreas; Amado, P.; Seifert, W.; Sánchez Carrasco, M. A.; Ribas, I.; Reiners, A.; Mandel, H.; Caballero, J. A.; Mundt, R.; Galadí, D.; Consortium, CARMENES

    2013-01-01

    CARMENES (Calar Alto high-Resolution search for M dwarfs with Exo-earths with Near-infrared and optical Echelle Spectrographs) is a next-generation instrument being built for the 3.5 m telescope at the Calar Alto Observatory by a consortium of eleven Spanish and German institutions. CARMENES will conduct a five-year exoplanet survey targeting ~300 M dwarfs. The CARMENES instrument consists of two separate échelle spectrographs covering the wavelength range from 0.55 to 1.7 µm at a spectral resolution of R = 82,000, fed by fibers from the Cassegrain focus of the telescope. For late-M spectral types, the wavelength range around 1.0 µm (Y band) is the most important region for radial-velocity work; therefore, the efficiency of CARMENES will be optimized in this range. Since CCDs do not provide high enough efficiency around 1.0 µm and no signal at all beyond the Si cutoff at 1.1 µm, a near-IR detector is required. It is thus natural to adopt an instrument concept with two spectrographs, one equipped with a CCD for the range 0.55-1.05 µm and one with HgCdTe detectors for the range 0.9-1.7 µm. Each spectrograph will be coupled to the 3.5 m telescope with its own optical fiber. The front end will contain a dichroic beam splitter and an atmospheric dispersion corrector to feed the light into the fibers leading to the spectrographs. Guiding is performed with a separate camera. Additional fibers are available for simultaneous injection of light from emission-line lamps for RV calibration. The spectrographs are mounted on benches inside vacuum tanks located in the coudé laboratory of the 3.5 m dome. Each vacuum tank is equipped with a temperature stabilization system capable of keeping the temperature constant to within ±0.01 K over 24 h. The visible-light spectrograph will be operated near room temperature; the NIR spectrograph will be cooled to 140 K.
    The CARMENES instrument passed its preliminary design review in July 2011; the final design is just being completed. Commissioning of the instrument is planned for the first half of 2014. At least 600 usable nights have been allocated at the Calar Alto 3.5 m telescope for the CARMENES survey in the time frame from 2014 to 2018.
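
A small worked number behind the spectrograph parameters above: at resolving power R = 82,000, one resolution element spans c / R in velocity, and λ / R in wavelength at a given λ.

```python
c_km_s = 299792.458               # speed of light, km/s
R = 82000.0                       # resolving power quoted above

dv_km_s = c_km_s / R              # velocity width of one resolution element (~3.7 km/s)
dlam_nm = 1000.0 / R              # wavelength width at lambda = 1000 nm, the Y band (~0.012 nm)
```

Precise radial-velocity work then relies on measuring line centroids to a small fraction of this element, which is why the calibration fibers and thermal stabilization matter.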

  12. A two-phase sampling design for increasing detections of rare species in occupancy surveys

    USGS Publications Warehouse

    Pacifici, Krishna; Dorazio, Robert M.; Dorazio, Michael J.

    2012-01-01

    1. Occupancy estimation is a commonly used tool in ecological studies owing to the ease with which data can be collected and the large spatial extent that can be covered. One major obstacle to using an occupancy-based approach is the complications associated with designing and implementing an efficient survey. These logistical challenges become magnified when working with rare species, when effort can be wasted in areas with no or very few individuals. 2. Here, we develop a two-phase sampling approach that mitigates these problems by using a design that places more effort in areas with higher predicted probability of occurrence. We compare our new sampling design to traditional single-season occupancy estimation under a range of conditions and population characteristics. We develop an intuitive measure of predictive error to compare the two approaches and use simulations to assess the relative accuracy of each approach. 3. Our two-phase approach exhibited lower predictive error rates compared to the traditional single-season approach in highly spatially correlated environments. The difference was greatest when detection probability was high (0.75) regardless of the habitat or sample size. When the true occupancy rate was below 0.4 (0.05-0.4), we found that allocating 25% of the sample to the first phase resulted in the lowest error rates. 4. In the majority of scenarios, the two-phase approach showed lower error rates than the traditional single-season approach, suggesting our new approach is fairly robust to a broad range of conditions and design factors and merits use under a wide variety of settings. 5. Synthesis and applications. Conservation and management of rare species is a challenging task facing natural resource managers. It is critical for studies involving rare species to allocate effort and resources efficiently, as these are usually finite.
    We believe our approach provides a framework for optimal allocation of effort while maximizing the information content of the data, in an attempt to provide the highest conservation value per unit of effort.
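
The two-phase idea above can be sketched as a simulation: spend a fraction of the visit budget on a broad first phase, then concentrate the remaining visits on the sites with the highest modelled probability of occurrence. The occupancy and detection parameters below are illustrative assumptions, not the paper's values, and the ranking simply reuses the generating probabilities as a stand-in for a fitted occupancy model.

```python
import random

random.seed(42)

n_sites, budget = 200, 400
psi = [0.05 + 0.35 * random.random() for _ in range(n_sites)]   # occurrence prob per site
occupied = [random.random() < p for p in psi]                   # true (latent) state
p_detect = 0.75                                                 # per-visit detection prob

def visit(site):
    """One survey visit: detection is only possible at an occupied site."""
    return occupied[site] and random.random() < p_detect

# Phase 1: 25% of the budget spread over a random subset of sites.
phase1_sites = random.sample(range(n_sites), budget // 4)
detections = {s for s in phase1_sites if visit(s)}

# Phase 2: remaining visits targeted at the sites ranked most promising.
ranked = sorted(range(n_sites), key=lambda s: psi[s], reverse=True)
for s in ranked[: budget - budget // 4]:
    if visit(s):
        detections.add(s)
```

Comparing the detection count against the same budget spent uniformly (a single-phase design) reproduces, in miniature, the error-rate comparison the abstract describes.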

  13. SIS Mixer Design for a Broadband Millimeter Spectrometer Suitable for Rapid Line Surveys and Redshift Determinations

    NASA Technical Reports Server (NTRS)

    Rice, F.; Sumner, M.; Zmuidzinas, J.; Hu, R.; LeDuc, H.; Harris, A.; Miller, D.

    2004-01-01

    We present some detail of the waveguide probe and SIS mixer chip designs for a low-noise 180-300 GHz double- sideband receiver with an instantaneous RF bandwidth of 24 GHz. The receiver's single SIS junction is excited by a broadband, fixed-tuned waveguide probe on a silicon substrate. The IF output is coupled to a 6-18 GHz MMIC low- noise preamplifier. Following further amplification, the output is processed by an array of 4 GHz, 128-channel analog autocorrelation spectrometers (WASP 11). The single-sideband receiver noise temperature goal of 70 Kelvin will provide a prototype instrument capable of rapid line surveys and of relatively efficient carbon monoxide (CO) emission line searches of distant, dusty galaxies. The latter application's goal is to determine redshifts by measuring the frequencies of CO line emissions from the star-forming regions dominating the submillimeter brightness of these galaxies. Construction of the receiver has begun; lab testing should begin in the fall. Demonstration of the receiver on the Caltech Submillimeter Observatory (CSO) telescope should begin in spring 2003.
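
The redshift determination the receiver targets reduces to a one-line calculation once a detected emission line at observed frequency f_obs is identified with a CO transition of known rest frequency, since f_obs = f_rest / (1 + z). The example frequencies below are illustrative.

```python
CO_32_GHZ = 345.796   # rest frequency of the CO J=3-2 transition, GHz

def redshift(f_rest_ghz, f_obs_ghz):
    """Redshift implied by identifying an observed line with a rest frequency."""
    return f_rest_ghz / f_obs_ghz - 1.0

# A CO J=3-2 line landing at 230.5 GHz inside the 180-300 GHz band:
z = redshift(CO_32_GHZ, 230.5)
```

In practice a single line is ambiguous (several CO transitions could match), which is why the wide 24 GHz instantaneous bandwidth matters: catching two CO lines at once pins down the transition identification and hence z.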

  14. Quantitative Imaging Network

    Cancer.gov

    The Quantitative Imaging Network (QIN) is designed to promote research and development of quantitative imaging methods for the measurement of tumor response to therapies in clinical trial settings, with the overall goal of facilitating clinical decision-making.

  15. Is the linear modeling technique good enough for optimal form design? A comparison of quantitative analysis models.

    PubMed

    Lin, Yang-Cheng; Yeh, Chung-Hsing; Wang, Chen-Cheng; Wei, Chun-Chun

    2012-01-01

    How to design highly reputable and hot-selling products is an essential issue in product design. Whether consumers choose a product depends largely on their perception of the product image. The consumer-oriented design approach presented in this paper helps product designers incorporate consumers' perceptions of product forms into the design process. The approach uses quantification theory type I (QTTI), grey prediction (the linear modeling technique), and neural networks (the nonlinear modeling technique) to determine the optimal form combination of product design for matching a given product image. An experimental study based on the concept of Kansei Engineering is conducted to collect numerical data for examining the relationship between consumers' perception of product image and the product form elements of personal digital assistants (PDAs). The performance comparison shows that the QTTI model is good enough to help product designers determine the optimal form combination of product design. Although the PDA form design is used as a case study, the approach is applicable to other consumer products with various design elements and product images. The approach provides an effective mechanism for facilitating the consumer-oriented product design process. PMID:23258961
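
The linear (QTTI-style) modelling step described above can be sketched as ordinary least squares: regress a product-image rating on binary-coded form-element indicators, so each element contributes additively to the predicted image score. All data here are synthetic stand-ins for Kansei experiment results.

```python
import numpy as np

rng = np.random.default_rng(3)

n_samples, n_elements = 60, 6
X = rng.integers(0, 2, size=(n_samples, n_elements)).astype(float)  # form-element indicators
w_true = np.array([0.9, -0.5, 0.7, 0.2, -0.8, 0.4])                 # hypothetical effects
y = 3.0 + X @ w_true + rng.normal(scale=0.2, size=n_samples)        # image ratings

# Ordinary least squares with an intercept column; the fitted weights play the
# role of QTTI category scores, and r2 measures how well the linear model fits.
A = np.c_[np.ones(n_samples), X]
w_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
r2 = 1.0 - np.sum((y - A @ w_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

Choosing the form combination that maximizes the fitted score is then a search over element levels; the paper's comparison asks whether this linear fit is adequate relative to a neural network.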

  16. Design Evolution of the Wide Field Infrared Survey Telescope Using Astrophysics Focused Telescope Assets (WFIRST-AFTA) and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Peabody, Hume L.; Peters, Carlton V.; Rodriguez-Ruiz, Juan E.; McDonald, Carson S.; Content, David A.; Jackson, Clifton E.

    2015-01-01

    The design of the Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA) continues to evolve as each design cycle is analyzed. In 2012, two Hubble sized (2.4 m diameter) telescopes were donated to NASA from elsewhere in the Federal Government. NASA began investigating potential uses for these telescopes and identified WFIRST as a mission to benefit from these assets. With an updated, deeper, and sharper field of view than previous design iterations with a smaller telescope, the optical designs of the WFIRST instruments were updated and the mechanical and thermal designs evolved around the new optical layout. Beginning with Design Cycle 3, significant analysis efforts yielded a design and model that could be evaluated for Structural-Thermal-Optical-Performance (STOP) purposes for the Wide Field Imager (WFI) and provided the basis for evaluating the high level observatory requirements. Development of the Cycle 3 thermal model provided some valuable analysis lessons learned and established best practices for future design cycles. However, the Cycle 3 design did include some major liens and evolving requirements which were addressed in the Cycle 4 Design. Some of the design changes are driven by requirements changes, while others are optimizations or solutions to liens from previous cycles. Again in Cycle 4, STOP analysis was performed and further insights into the overall design were gained leading to the Cycle 5 design effort currently underway. This paper seeks to capture the thermal design evolution, with focus on major design drivers, key decisions and their rationale, and lessons learned as the design evolved.

  18. A survey of Utah's public secondary education science teachers to determine their feelings of preparedness to teach engineering design

    NASA Astrophysics Data System (ADS)

    Ames, R. Tyler

The Next Generation Science Standards were released in 2013 and call for the inclusion of engineering design in the science classroom. This integration of science and engineering is exciting for many people and groups in both fields, but considerable uncertainty remains about how prepared science teachers feel to teach engineering design. This study analyzes the history of science standards leading up to the Next Generation Science Standards, establishes key components of engineering design, and lays the background for the study detailed in this report. A survey was given to several hundred public secondary science teachers in the state of Utah in which respondents were asked to report their feelings of preparedness on several aspects of engineering design. The findings of the study show that Utah teachers did not feel fully prepared to teach engineering design at the time of the survey (2014).

  19. National Aquatic Resource Surveys: Multiple objectives and constraints lead to design complexity

    EPA Science Inventory

    The US Environmental Protection Agency began conducting the National Aquatic resource Surveys (NARS) in 2007 with a national survey of lakes (NLA 2007) followed by rivers and streams in 2008-9 (NRSA 2008), coastal waters in 2010 (NCCA 2010) and wetlands in 2011 (NWCA). The surve...

  20. Flexibility by Design: How mobile GIS meets the needs of archaeological survey

    E-print Network

    Tripcevich, Nicholas

    2004-01-01

Flexibility by Design: How Mobile GIS Meets the Needs of … flexibility in interface design for mobile GIS. Implications … mobile GIS to their field research settings will revolve around the issues of flexible data acquisition, reliable designs, …

  1. Design of a Mars Airplane Propulsion System for the Aerial Regional-Scale Environmental Survey (ARES) Mission Concept

    NASA Technical Reports Server (NTRS)

    Kuhl, Christopher A.

    2008-01-01

The Aerial Regional-Scale Environmental Survey (ARES) is a Mars exploration mission concept that utilizes a rocket-propelled airplane to take scientific measurements of atmospheric, surface, and subsurface phenomena. The liquid rocket propulsion system design has matured through several design cycles and trade studies since the inception of the ARES concept in 2002. This paper describes the process of selecting a bipropellant system over other propulsion system options, and provides details on the rocket system design, thrusters, propellant tank and propellant management device (PMD) design, propellant isolation, and flow control hardware. The paper also summarizes computer model results of thruster plume interactions and simulated flight performance. The airplane has a 6.25 m wingspan with a total wet mass of 185 kg and has the ability to fly over 600 km through the atmosphere of Mars with 45 kg of MMH/MON3 propellant.

  2. Geological and seismological survey for new design-basis earthquake ground motion of Kashiwazaki-Kariwa NPS

    NASA Astrophysics Data System (ADS)

    Takao, M.; Mizutani, H.

    2009-05-01

At about 10:13 on July 16, 2007, a strong earthquake named the 'Niigata-ken Chuetsu-oki Earthquake', of Mj 6.8 on the Japan Meteorological Agency's scale, occurred offshore of Niigata prefecture in Japan. However, all of the nuclear reactors at Kashiwazaki-Kariwa Nuclear Power Station (KKNPS) in Niigata prefecture, operated by Tokyo Electric Power Company, shut down safely. In other words, the automatic safety functions composed of shutdown, cooling, and containment worked as designed immediately after the earthquake. During the earthquake, the peak acceleration of the ground motion exceeded the design-basis ground motion (DBGM), but the force applied to safety-significant facilities by the earthquake was about the same as or less than the design basis taken into account as static seismic force. In order to assess anew the safety of the nuclear power plant, we have evaluated a new DBGM after conducting geomorphological, geological, geophysical, and seismological surveys and analyses. [Geomorphological, geological and geophysical survey] In the land area, aerial photograph interpretation was performed within at least a 30 km radius to extract landforms that could possibly be tectonic reliefs. Geological reconnaissance was then conducted to confirm whether the extracted landforms are tectonic reliefs or not. In particular, we carefully investigated the Nagaoka Plain Western Boundary Fault Zone (NPWBFZ), which consists of the Kakuda-Yahiko, Kihinomiya, and Katakai faults, because NPWBFZ is one of the active faults with the potential for an Mj 8-class earthquake in Japan. In addition to the geological survey, seismic reflection prospecting of approximately 120 km in total length was completed to evaluate the geological structure of the faults and to assess the continuity of the component faults of NPWBFZ. 
As a result of the geomorphological, geological, and geophysical surveys, we evaluated that the three component faults of NPWBFZ are independent of each other from the viewpoint of geological structure; however, we decided to take into consideration the simultaneous movement of the three faults, 91 km in total length, in the seismic design as a case of uncertainty. In the sea area, we conducted seismic reflection prospecting with sonic waves in an area stretching for about 140 km along the coastline and 50 km perpendicular to the coastline. When analyzing the seismic profiles, we evaluated the activities of faults and folds carefully on the basis of the concept of fault-related folding, because the sedimentary layers off Niigata prefecture are very thick and the geological structures are characterized by folding. As a result of the seismic reflection survey and analyses, we assessed that five active faults (folds) are to be taken into consideration in the seismic design of the sea area, and we evaluated that the 36 km long F-B fault will have the largest impact on KKNPS. [Seismological survey] As a result of analyses of the geological survey, data from the NCOE, and data from the 2004 Chuetsu Earthquake, it became clear that there are factors that intensify seismic motions in this area. For each of the two selected earthquake sources, namely NPWBFZ and the F-B fault, we calculated seismic ground motions on the free surface of the base stratum as the design-basis ground motion (DBGM) Ss, using both empirical and numerical ground-motion evaluation methods. The PGA value of the DBGM is 2,300 Gal for units 1 to 4, located in the southern part of the KKNPS site, and 1,050 Gal for units 5 to 7 in the northern part.
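The empirical ground-motion step mentioned above can be sketched with a generic attenuation relation of the form log10(PGA) = a·M − b·log10(R + c) + d. The functional form is standard in engineering seismology, but the coefficients below are hypothetical placeholders, not the relation actually used for the Kashiwazaki-Kariwa assessment.

```python
import math

# Hedged sketch of an empirical attenuation relation. Coefficients are
# hypothetical placeholders for illustration only.
A, B, C, D = 0.5, 1.0, 10.0, -0.5

def pga_gal(magnitude, distance_km):
    """Peak ground acceleration (Gal) from a generic attenuation relation."""
    log_pga = A * magnitude - B * math.log10(distance_km + C) + D
    return 10 ** log_pga

# A nearby large source produces stronger shaking than a distant or smaller one.
near_large = pga_gal(8.0, 5.0)    # e.g. NPWBFZ-class event close to the site
far_large = pga_gal(8.0, 50.0)
near_small = pga_gal(6.8, 5.0)
```

Such relations are evaluated for each selected source, and the controlling case defines the design-basis ground motion on the free surface of the base stratum.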

  3. An integrated device for magnetically-driven drug release and in situ quantitative measurements: Design, fabrication and testing

    NASA Astrophysics Data System (ADS)

    Bruvera, I. J.; Hernández, R.; Mijangos, C.; Goya, G. F.

    2015-03-01

We have developed a device capable of remote triggering and in situ quantification of therapeutic drugs, based on magnetically responsive hydrogels of poly(N-isopropylacrylamide) (PNiPAAm) and alginate. The heating efficiency of these hydrogels, measured by their specific power absorption (SPA), showed that values between 100 and 300 W/g of material were high enough to reach the lower critical solution temperature (LCST) of the polymeric matrix within a few minutes. The drug release through application of AC magnetic fields could be controlled by time-modulated field pulses in order to deliver the desired amount of drug. Using vitamin B12 as a model drug, the device was calibrated to measure amounts of released drug as small as 25(2)×10⁻⁹ g, demonstrating the potential of this device for very precise quantitative control of drug release.
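The SPA figure quoted above is commonly estimated calorimetrically from the initial heating slope: SPA = m_sample · c_p · (dT/dt) / m_magnetic, in watts per gram of magnetic material. The sketch below illustrates that calculation with synthetic numbers, not measurements from the device.

```python
# Hedged sketch of a calorimetric SPA estimate. All numbers are synthetic
# illustrative values.

def initial_slope(times_s, temps_c):
    """Least-squares slope (deg C per s) over the initial linear region."""
    n = len(times_s)
    mt = sum(times_s) / n
    mT = sum(temps_c) / n
    num = sum((t - mt) * (T - mT) for t, T in zip(times_s, temps_c))
    den = sum((t - mt) ** 2 for t in times_s)
    return num / den

def spa_w_per_g(times_s, temps_c, m_sample_g, c_p_j_per_gk, m_magnetic_g):
    """SPA in W per gram of magnetic material, from the initial heating rate."""
    dT_dt = initial_slope(times_s, temps_c)
    return m_sample_g * c_p_j_per_gk * dT_dt / m_magnetic_g

# Synthetic run: 0.05 K/s heating of 1 g of ferrogel (c_p ~ 4.18 J/gK)
# containing 1 mg of magnetic nanoparticles.
times = [0, 10, 20, 30, 40]
temps = [25.0 + 0.05 * t for t in times]
spa = spa_w_per_g(times, temps, 1.0, 4.18, 0.001)
```

With these synthetic values the estimate lands at 209 W/g, within the 100-300 W/g range reported for the hydrogels.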

  4. A novel quantitative immunoassay system for p53 using antibodies selected for optimum designation of p53 status.

    PubMed Central

    Thomas, M D; McIntosh, G G; Anderson, J J; McKenna, D M; Parr, A H; Johnstone, R; Lennard, T W; Horne, C H; Angus, B

    1997-01-01

AIM: To develop a highly sensitive and specific enzyme linked immunosorbent assay (ELISA) system for analysis of p53 protein in cancer lysates. METHODS: The anti-p53 monoclonal antibodies DO7, 1801, BP53.12, and 421, and anti-p53 polyclonal antiserum CM1 were assessed by immunohistochemistry and western blot analysis to identify those most suitable for determining the p53 status of cancer cells. Antibodies with the desired characteristics were used to develop a non-competitive sandwich-type ELISA system for analysis of p53 expression in cancer cytosols. Using the ELISA, p53 protein concentrations were measured in a small series of breast cancers, and the quantitative values compared with p53 immunohistochemical data for the same cancers. RESULTS: DO7 and 1801 gave the most specific and reliable results on immunohistochemistry and western blot analysis. Using these two antibodies, a non-competitive sandwich-type ELISA system was developed to analyse p53 quantitatively. Analysis of the breast cancer series showed a good correlation between immunohistochemistry and the ELISA: tumours were generally positive using both techniques. Discrepancies were noted, however: some cancers were immunohistochemically negative but ELISA positive. One explanation for this may be that the ELISA is more sensitive than immunohistochemistry. CONCLUSION: The p53 ELISA system is a non-competitive double monoclonal antibody sandwich method, using DO7 and 1801, which have been shown to be highly specific for p53 protein by immunohistochemistry and western blot analysis. The lower threshold of the assay is 0.1 ng/ml analyte in an enriched recombinant p53 preparation. As p53 is now regarded as a protein associated with prognosis in breast and other cancers, the assay may have clinical applications. PMID:9155696
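Sandwich-ELISA readouts like these are usually converted to analyte concentrations through a four-parameter logistic (4PL) calibration curve fitted to standards. The sketch below shows the 4PL forward and inverse equations with hypothetical parameters, not values from the p53 assay itself.

```python
# Hedged sketch of 4PL ELISA calibration. Parameters a, b, c, d below are
# hypothetical, not fitted to the p53 assay.

def four_pl(x, a, b, c, d):
    """4PL response: a = response at zero analyte, d = response at saturation,
    c = inflection concentration, b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse_four_pl(y, a, b, c, d):
    """Back-calculate concentration from an observed response."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

a, b, c, d = 0.05, 1.2, 5.0, 2.0   # hypothetical fitted parameters
conc = 0.8                          # ng/ml, a test concentration
od = four_pl(conc, a, b, c, d)      # simulated optical density
recovered = inverse_four_pl(od, a, b, c, d)
```

The inverse function is what turns a patient sample's optical density into a reported concentration; the assay's 0.1 ng/ml lower threshold corresponds to the flat low end of this curve.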

  5. The FMOS-COSMOS survey of star-forming galaxies at z~1.6 III. Survey design, performance, and sample characteristics

    E-print Network

    Silverman, J D; Arimoto, N; Renzini, A; Rodighiero, G; Daddi, E; Sanders, D; Kartaltepe, J; Zahid, J; Nagao, T; Kewley, L J; Lilly, S J; Sugiyama, N; Capak, P; Carollo, C M; Chu, J; Hasinger, G; Ilbert, O; Kajisawa, M; Koekemoer, A M; Kovac, K; Fevre, O Le; Masters, D; McCracken, H J; Onodera, M; Scoville, N; Strazzullo, V; Taniguchi, Y

    2014-01-01

    We present a spectroscopic survey of galaxies in the COSMOS field using the Fiber Multi-Object Spectrograph (FMOS), a near-infrared instrument on the Subaru Telescope. Our survey is specifically designed to detect the Halpha emission line that falls within the H-band (1.6-1.8 micron) spectroscopic window from star-forming galaxies with M_stellar>10^10 Msolar and 1.4 < z < 1.7. With the high multiplex capabilities of FMOS, it is now feasible to construct samples of over one thousand galaxies having spectroscopic redshifts at epochs that were previously challenging. The high-resolution mode (R~2600) is implemented to effectively separate Halpha and [NII] emission lines thus enabling studies of gas-phase metallicity and photoionization conditions of the interstellar medium. The broad goals of our program are concerned with how star formation depends on stellar mass and environment, both recognized as drivers of galaxy evolution at lower redshifts. In addition to the main galaxy sample, our target selection...

  6. Essential Steps for Web Surveys: A Guide to Designing, Administering and Utilizing Web Surveys for University Decision-Making. Professional File. Number 102, Winter 2006

    ERIC Educational Resources Information Center

    Cheskis-Gold, Rena; Loescher, Ruth; Shepard-Rabadam, Elizabeth; Carroll, Barbara

    2006-01-01

    During the past few years, several Harvard paper surveys were converted to Web surveys. These were high-profile surveys endorsed by the Provost and the Dean of the College, and covered major portions of the university population (all undergraduates, all graduate students, tenured and non-tenured faculty). When planning for these surveys started in…

  7. German health interview and examination survey for adults (DEGS) - design, objectives and implementation of the first data collection wave

    PubMed Central

    2012-01-01

Background The German Health Interview and Examination Survey for Adults (DEGS) is part of the recently established national health monitoring conducted by the Robert Koch Institute. DEGS combines a nationally representative periodic health survey and a longitudinal study based on follow-up of survey participants. Funding is provided by the German Ministry of Health and supplemented for specific research topics from other sources. Methods/design The first DEGS wave of data collection (DEGS1) extended from November 2008 to December 2011. Overall, 8152 men and women participated. Of these, 3959 persons had already participated in the German National Health Interview and Examination Survey 1998 (GNHIES98), at which time they were 18–79 years of age. Another 4193 persons 18–79 years of age were recruited for DEGS1 in 2008–2011 based on two-stage stratified random sampling from local population registries. Health data and context variables were collected using standardized computer-assisted personal interviews, self-administered questionnaires, and standardized measurements and tests. In order to keep survey results representative for the population aged 18–79 years, results will be weighted by survey-specific weighting factors considering sampling and drop-out probabilities as well as deviations between the design-weighted net sample and German population statistics 2010. Discussion DEGS aims to establish a nationally representative data base on the health of adults in Germany. This health data platform will be used for continuous health reporting and health care research. The results will help to support health policy planning and evaluation. Repeated cross-sectional surveys will permit analyses of time trends in morbidity, functional capacity levels, disability, and health risks and resources. Follow-up of study participants will provide the opportunity to study trajectories of health and disability. 
A special focus lies on chronic diseases including asthma, allergies, cardiovascular conditions, diabetes mellitus, and musculoskeletal diseases. Other core topics include vaccine-preventable diseases and immunization status, nutritional deficiencies, health in older age, and the association between health-related behavior and mental health. PMID:22938722
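The survey-specific weighting described above can be illustrated by its simplest building block, post-stratification: each stratum's weight is the ratio of its population share to its sample share, so the weighted sample reproduces known population margins. The age strata and shares below are hypothetical, not the DEGS1 weighting cells.

```python
# Hedged sketch of post-stratification weighting. Strata and shares are
# hypothetical illustrative values.

sample_counts = {"18-39": 300, "40-59": 400, "60-79": 300}   # respondents
population_share = {"18-39": 0.40, "40-59": 0.35, "60-79": 0.25}

n = sum(sample_counts.values())

# Weight = population share / sample share for each stratum.
weights = {g: population_share[g] / (sample_counts[g] / n)
           for g in sample_counts}

# After weighting, sample margins match the population margins.
weighted_share = {g: weights[g] * sample_counts[g] / n for g in sample_counts}
```

In practice the DEGS factors also fold in sampling and drop-out probabilities, but the margin-matching logic is the same.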

  8. Chandra Multi-wavelength Plane (ChaMPlane) Survey: Design and Initial Results

    E-print Network

    Jonathan Grindlay; Ping Zhao; JaeSub Hong; Johnathan Jenkins; Dong-Woo Kim; Eric Schlegel; Jeremy Drake; Vinay Kashyap; Peter Edmonds; Haldan Cohn; Phyllis Lugger; Adrienne Cool

    2002-11-25

The Chandra Multi-wavelength Plane (ChaMPlane) Survey of the galactic plane incorporates serendipitous sources from selected Chandra pointings in or near the galactic plane (exposures > 20 ksec; lack of bright diffuse or point sources) to measure or constrain the luminosity function of low-luminosity accretion sources in the Galaxy. The primary goal is to detect and identify accreting white dwarfs (cataclysmic variables, with space density still uncertain by a factor of >10-100), neutron stars, and black holes (quiescent low-mass X-ray binaries) to constrain their space densities and thus origin and evolution. Secondary objectives are to identify Be stars in high-mass X-ray binaries and constrain their space densities, and to survey the H-R diagram for stellar coronal sources. A parallel optical imaging program under the NOAO Long Term Survey program provides deep optical images using the Mosaic imager on the CTIO and KPNO 4-m telescopes. The 36 arcmin × 36 arcmin optical images (Halpha, R, V, and I) cover ~5× the area of each enclosed Chandra ACIS FOV, providing an extended survey of emission-line objects for comparison with Chandra. Spectroscopic followup of optical counterparts is then conducted, thus far with WIYN and Magellan. Preliminary results from both the Chandra and optical surveys will be presented, including logN-logS vs. galactic position (l,b) and optical identifications.

  9. Design and Specification of Optical Bandpass Filters for Hubble Space Telescope (HST) Advanced Camera for Surveys (ACS)

    NASA Technical Reports Server (NTRS)

    Leviton, Douglas B.; Tsevetanov, Zlatan; Woodruff, Bob; Mooney, Thomas A.

    1998-01-01

Advanced optical bandpass filters for the Hubble Space Telescope (HST) Advanced Camera for Surveys (ACS) have been developed on a filter-by-filter basis through detailed studies which take into account the instrument's science goals, available optical filter fabrication technology, and developments in ACS's charge-coupled-device (CCD) detector technology. These filters include a subset of filters for the Sloan Digital Sky Survey (SDSS) which are optimized for astronomical photometry using today's CCDs. In order for ACS to be truly advanced, these filters must push the state-of-the-art in performance in a number of key areas at the same time. Important requirements for these filters include outstanding transmitted wavefront, high transmittance, uniform transmittance across each filter, spectrally structure-free bandpasses, exceptionally high out-of-band rejection, a high degree of parfocality, and immunity to environmental degradation. These constitute a very stringent set of requirements indeed, especially for filters which are up to 90 mm in diameter. The highly successful paradigm in which final specifications for flight filters were derived through interaction amongst the ACS Science Team, the instrument designer, the lead optical engineer, and the filter designer and vendor is described. Examples of iterative design trade studies carried out in the context of science needs and budgetary and schedule constraints are presented. An overview of the final design specifications for the ACS bandpass and ramp filters is also presented.

  10. Predictors of intentions to quit smoking in Aboriginal tobacco smokers of reproductive age in regional New South Wales (NSW), Australia: quantitative and qualitative findings of a cross-sectional survey

    PubMed Central

    Gould, Gillian Sandra; Watt, Kerrianne; McEwen, Andy; Cadet-James, Yvonne; Clough, Alan R

    2015-01-01

Objectives To assess the predictors of intentions to quit smoking in a community sample of Aboriginal smokers of reproductive age, in whom smoking prevalence is slow to decline. Design, setting and participants A cross-sectional survey involved 121 Aboriginal smokers aged 18–45 years, interviewed from January to May 2014 at community events on the Mid-North Coast of NSW. Qualitative and quantitative data were collected on smoking and quitting attitudes, behaviours and home smoking rules. Perceived efficacy for quitting, and perceived threat from smoking, were uniquely assessed with a validated Risk Behaviour Diagnosis (RBD) Scale. Main outcome measures Logistic regression explored the impact of perceived efficacy, perceived threat and consulting previously with a doctor or health professional (HP) on self-reported intentions to quit smoking, controlling for potential confounders, that is, protection responses and fear control responses, home smoking rules, gender and age. Participants' comments regarding smoking and quitting were investigated via inductive analysis, with the assistance of Aboriginal researchers. Results Two-thirds of smokers intended to quit within 3 months. Perceived efficacy (OR=4.8; 95% CI 1.78 to 12.93) and consulting previously with a doctor/HP about quitting (OR=3.82; 95% CI 1.43 to 10.2) were significant predictors of intentions to quit. 'Smoking is not doing harm right now' was inversely associated with quit intentions (OR=0.25; 95% CI 0.08 to 0.8). Among those who reported making a quit attempt after consulting with a doctor/HP, 40% (22/60) rated the professional support received as low (0–2/10). Qualitative themes were: the negatives of smoking (ie, disgust, regret, dependence and stigma), health effects and awareness, quitting, denial, 'smoking helps me cope' and social aspects of smoking. 
Conclusions Perceived efficacy and consulting with a doctor/HP about quitting may be important predictors of intentions to quit smoking in Aboriginal smokers of reproductive age. Professional support was generally perceived to be low; thus, it could be improved for these Aboriginal smokers. Aboriginal participants expressed strong sentiments about smoking and quitting. PMID:25770232
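The odds ratios with 95% confidence intervals reported above come from logistic regression; their simplest building block is the odds ratio of a 2x2 table with a Wald interval on the log scale. The sketch below uses hypothetical counts, not the survey's data.

```python
import math

# Hedged sketch: odds ratio and 95% CI from a 2x2 table. Counts are
# hypothetical illustrative values.

def odds_ratio_ci(a, b, c, d):
    """a/b: exposed with/without outcome; c/d: unexposed with/without.
    Returns (OR, lower 95% bound, upper 95% bound)."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# e.g. 40 of 60 who consulted a doctor/HP intend to quit, vs 30 of 61 who did not
or_, lo, hi = odds_ratio_ci(40, 20, 30, 31)
```

A CI that excludes 1 (as for perceived efficacy above) indicates a statistically significant association; the full regression additionally adjusts for the listed confounders.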

  11. The Outer Solar System Origins Survey: I. Design and First-Quarter Discoveries

    E-print Network

    Bannister, Michele T; Petit, Jean-Marc; Gladman, Brett J; Gwyn, Stephen D J; Chen, Ying-Tung; Volk, Kathryn; Alexandersen, Mike; Benecchi, Susan; Delsanti, Audrey; Fraser, Wesley; Granvik, Mikael; Grundy, Will M; Guilbert-Lepoutre, Aurelie; Hestroffer, Daniel; Ip, Wing-Huen; Jakubik, Marian; Jones, Lynne; Kaib, Nathan; Lacerda, Pedro; Lawler, Samantha; Lehner, Matthew J; Lin, Hsing Wen; Lister, Tim; Lykawka, Patryk Sofia; Monty, Stephanie; Marsset, Michael; Murray-Clay, Ruth; Noll, Keith; Parker, Alex; Pike, Rosemary E; Rousselot, Philippe; Rusk, David; Schwamb, Megan E; Shankman, Cory; Sicardy, Bruno; Vernazza, Pierre; Wang, Shiang-Yu

    2015-01-01

We report 85 trans-Neptunian objects (TNOs) from the first 42 deg$^{2}$ of the Outer Solar System Origins Survey (OSSOS), an ongoing $r$-band survey with the 0.9 deg$^{2}$ field-of-view MegaPrime camera on the 3.6 m Canada-France-Hawaii Telescope. A dense observing cadence and our innovative astrometric technique produced survey-measured orbital elements for these TNOs precise to a fractional semi-major axis uncertainty … debiasing the discovery sample. We confirm the existence of a cold "kernel" of objects within the main cold classical Kuiper belt, and imply the existence of an extension of the "stirred" cold classical Kuiper belt to at least several AU beyond the 2:1 mean motion resonance with Neptune. The popula...

  12. Using SEM to Analyze Complex Survey Data: A Comparison between Design-Based Single-Level and Model-Based Multilevel Approaches

    ERIC Educational Resources Information Center

    Wu, Jiun-Yu; Kwok, Oi-man

    2012-01-01

    Both ad-hoc robust sandwich standard error estimators (design-based approach) and multilevel analysis (model-based approach) are commonly used for analyzing complex survey data with nonindependent observations. Although these 2 approaches perform equally well on analyzing complex survey data with equal between- and within-level model structures…
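The design-based intuition behind the comparison above can be shown in miniature: with clustered observations, a naive simple-random-sample standard error of a mean understates the uncertainty, while a cluster-level ("ultimate cluster") estimator does not. The data below are hypothetical illustrative values.

```python
import math

# Hedged sketch: naive vs cluster-level SE of a mean for clustered data.
# Data are hypothetical illustrative values.

def naive_se(values):
    """SRS standard error of the mean, ignoring clustering."""
    n = len(values)
    m = sum(values) / n
    var = sum((v - m) ** 2 for v in values) / (n - 1)
    return math.sqrt(var / n)

def cluster_se(clusters):
    """SE of the overall mean treating cluster means as the sampling units."""
    means = [sum(c) / len(c) for c in clusters]
    k = len(means)
    gm = sum(means) / k
    var = sum((m - gm) ** 2 for m in means) / (k - 1)
    return math.sqrt(var / k)

# Strong within-cluster homogeneity: observations echo their cluster.
clusters = [[1.0, 1.2, 0.8], [5.0, 4.9, 5.1], [3.0, 3.1, 2.9]]
flat = [v for c in clusters for v in c]
se_srs = naive_se(flat)     # too small when clusters differ
se_cl = cluster_se(clusters)
```

Sandwich estimators and multilevel models are two more general ways of making the same correction for nonindependent observations in complex survey data.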

  13. Disposable surface plasmon resonance aptasensor with membrane-based sample handling design for quantitative interferon-gamma detection.

    PubMed

    Chuang, Tsung-Liang; Chang, Chia-Chen; Chu-Su, Yu; Wei, Shih-Chung; Zhao, Xi-hong; Hsueh, Po-Ren; Lin, Chii-Wann

    2014-08-21

ELISA and ELISPOT methods are utilized for interferon-gamma (IFN-γ) release assays (IGRAs) to detect the IFN-γ secreted by T lymphocytes. However, the multi-step protocols of the assays are still performed with laboratory instruments and operated by well-trained people. Here, we report a membrane-based microfluidic device integrated with a surface plasmon resonance (SPR) sensor to realize an easy-to-use and cost-effective multi-step quantitative analysis. To conduct the SPR measurements, we utilized a membrane-based SPR sensing device in which a rayon membrane was located 300 μm under the absorbent pad. The basic equation covering this type of transport is based on Darcy's law. Furthermore, the concentration of streptavidin delivered from a sucrose-treated glass pad placed alongside the rayon membrane was controlled in a narrow range (0.81 μM ± 6%). Finally, the unbound molecules were removed by a washing buffer that was pre-packed in the reservoir of the chip. Using a bi-functional, hairpin-shaped aptamer as the sensing probe, we specifically detected the IFN-γ and amplified the signal by binding the streptavidin. A high correlation coefficient (R² = 0.995) was obtained in the range from 0.01 to 100 nM. A detection limit of 10 pM was achieved within 30 min. Thus, the SPR assay protocols for IFN-γ detection could be performed using this simple device without an additional pumping system. PMID:24931052
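A calibration like the one quoted above (high R² across 0.01-100 nM) is typically a linear fit of sensor response against log concentration. The sketch below computes such a fit and its coefficient of determination from synthetic response values, not the sensor's actual output.

```python
import math

# Hedged sketch: log-linear biosensor calibration with R^2. Response values
# are synthetic illustrative data.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

def r_squared(xs, ys, slope, intercept):
    """Coefficient of determination of the fitted line."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

conc_nM = [0.01, 0.1, 1.0, 10.0, 100.0]
log_c = [math.log10(c) for c in conc_nM]
response = [0.11, 0.29, 0.52, 0.70, 0.91]   # synthetic, nearly log-linear

slope, intercept = linear_fit(log_c, response)
r2 = r_squared(log_c, response, slope, intercept)
```

Inverting the fitted line converts an unknown sample's SPR response back into a concentration, down to the detection limit where the response leaves the linear regime.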

  14. Design of a detection survey for Ostreid herpesvirus-1 using hydrodynamic dispersion models to determine epidemiological units.

    PubMed

    Pande, Anjali; Acosta, Hernando; Brangenberg, Naya Alexis; Keeling, Suzanne Elizabeth

    2015-04-01

Using Ostreid herpesvirus-1 (OsHV-1) as a case study, this paper considers a survey design methodology for an aquatic animal pathogen that incorporates the concept of biologically independent epidemiological units. Hydrodynamically modelled epidemiological units are used to divide marine areas into sensible sampling units for detection surveys of waterborne diseases. In the aquatic environment it is difficult to manage disease at the level of the individual animal, so management practices are often aimed at a group of animals sharing a similar risk. Epidemiological units define such groups by a similar probability of exposure, derived from the modelled potential spread of a viral particle via coastal currents, and can thereby help inform management decisions. PMID:25746929
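One way to turn a hydrodynamic dispersion model into epidemiological units is to threshold a site-to-site connectivity matrix (the modelled probability that a particle released at site i reaches site j) and take connected components. The sketch below shows that grouping step with a hypothetical matrix, not the study's dispersion-model output.

```python
# Hedged sketch: grouping sites into epidemiological units via a thresholded
# hydrodynamic connectivity matrix. The matrix is hypothetical.

def epidemiological_units(connectivity, threshold):
    """Connected components of sites whose particle-exchange probability
    (in either direction) meets the threshold, via union-find."""
    n = len(connectivity)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    for i in range(n):
        for j in range(n):
            if i != j and max(connectivity[i][j], connectivity[j][i]) >= threshold:
                union(i, j)

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Sites 0-1 exchange water strongly, 2-3 moderately, site 4 is isolated.
C = [
    [1.0, 0.6, 0.0, 0.0, 0.0],
    [0.5, 1.0, 0.01, 0.0, 0.0],
    [0.0, 0.02, 1.0, 0.3, 0.0],
    [0.0, 0.0, 0.4, 1.0, 0.01],
    [0.0, 0.0, 0.0, 0.0, 1.0],
]
units = epidemiological_units(C, 0.1)   # -> [[0, 1], [2, 3], [4]]
```

Each resulting unit can then be sampled as a biologically independent population in the detection survey.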

  15. The Results of the National Heritage Language Survey: Implications for Teaching, Curriculum Design, and Professional Development

    ERIC Educational Resources Information Center

    Carreira, Maria; Kagan, Olga

    2011-01-01

    This article reports on a survey of heritage language learners (HLLs) across different heritage languages (HLs) and geographic regions in the United States. A general profile of HLLs emerges as a student who (1) acquired English in early childhood, after acquiring the HL; (2) has limited exposure to the HL outside the home; (3) has relatively…

  16. The TRacking Adolescents' Individual Lives Survey (TRAILS): Design, Current Status, and Selected Findings

    ERIC Educational Resources Information Center

    Ormel, Johan; Oldehinkel, Albertine J.; Sijtsema, Jelle; van Oort, Floor; Raven, Dennis; Veenstra, Rene; Vollebergh, Wilma A. M.; Verhulst, Frank C.

    2012-01-01

    Objectives: The objectives of this study were as follows: to present a concise overview of the sample, outcomes, determinants, non-response and attrition of the ongoing TRacking Adolescents' Individual Lives Survey (TRAILS), which started in 2001; to summarize a selection of recent findings on continuity, discontinuity, risk, and protective…

  17. NATIONAL SURVEY OF ADOLESCENT WELL-BEING (NSCAW): A COMPARISON OF MODEL AND DESIGN BASED ANALYSES

    E-print Network

… and maternal substance abuse on a child's cognitive stimulation scores for a subset of the children … members of society. The Department of Health and Human Services sponsored the National Survey of Child … with the child welfare system. This paper uses the NSCAW data to investigate the role of maternal depression …

  18. Designing Messaging to Engage Patients in an Online Suicide Prevention Intervention: Survey Results From Patients With Current Suicidal Ideation

    PubMed Central

    Lungu, Anita; Richards, Julie; Simon, Gregory E; Clingan, Sarah; Siler, Jaeden; Snyder, Lorilei; Ludman, Evette

    2014-01-01

Background Computerized, Internet-delivered interventions can be efficacious; however, uptake and maintaining sustained client engagement remain major challenges. We see the development of effective engagement strategies as the next frontier in online health interventions, an area where much creative research has begun. We also argue that for engagement strategies to accomplish their purpose with novel target populations, they need to be tailored to such populations (ie, content is designed with the target population in mind). User-centered design frameworks provide a theoretical foundation for increasing user engagement and uptake by including users in development. However, deciding how to implement this approach to engage users in mental health intervention development is challenging. Objective The aim of this study was to get user input and feedback on the acceptability of messaging content intended to engage suicidal individuals. Methods In March 2013, clinic intake staff distributed flyers announcing the study, "Your Feedback Counts", to potential participants (individuals waiting to be seen for a mental health appointment) together with the Patient Health Questionnaire. The flyer explained that a score of two or three ("more than half the days" or "nearly every day", respectively) on the suicide ideation question made them eligible to provide feedback on components of a suicide prevention intervention under development. The patient could access an anonymous online survey by following a link. After providing consent online, participants completed the anonymous survey. Results Thirty-four individuals provided data on past demographic information. Participants reported that they would be most drawn to an intervention where they knew that they were cared about, that was personalized, that others like them had found it helpful, and that included examples with real people. 
Participants preferred email invitations with subject lines expressing concern and availability of extra resources. Participants also provided feedback about a media prototype including a brand design and advertisement video for introducing the intervention. Conclusions This paper provides one model (including development of an engagement survey, audience for an engagement survey, methods for presenting results of an engagement survey) for including target users in the development of uptake strategies for online mental health interventions. PMID:24509475

  19. Quantitative Analysis of Adulterations in Oat Flour by FT-NIR Spectroscopy, Incomplete Unbalanced Randomized Block Design, and Partial Least Squares

    PubMed Central

    Wang, Ning; Zhang, Xingxiang; Yu, Zhuo; Li, Guodong; Zhou, Bin

    2014-01-01

    This paper developed a rapid and nondestructive method for quantitative analysis of a cheaper adulterant (wheat flour) in oat flour by NIR spectroscopy and chemometrics. Reflectance FT-NIR spectra in the range of 4000 to 12,000 cm⁻¹ of 300 oat flour samples adulterated with wheat flour were measured. The doping levels of wheat flour ranged from 5% to 50% (w/w). To ensure the generalization performance of the method, both the oat and the wheat flour samples were collected from different producing areas and an incomplete unbalanced randomized block (IURB) design was performed to include the significant variations that may be encountered in future samples. Partial least squares regression (PLSR) was used to develop calibration models for predicting the levels of wheat flour. Different preprocessing methods including smoothing, taking second-order derivative (D2), and standard normal variate (SNV) transformation were investigated to improve the model accuracy of PLS. The root mean squared error of Monte Carlo cross-validation (RMSEMCCV) and root mean squared error of prediction (RMSEP) were 1.921 and 1.975 (%, w/w) by D2-PLS, respectively. The results indicate that NIR and chemometrics can provide a rapid method for quantitative analysis of wheat flour in oat flour. PMID:25143857

  20. GRAND DESIGN AND FLOCCULENT SPIRALS IN THE SPITZER SURVEY OF STELLAR STRUCTURE IN GALAXIES (S{sup 4}G)

    SciTech Connect

    Elmegreen, Debra Meloy; Yau, Andrew; Elmegreen, Bruce G.; Athanassoula, E.; Bosma, Albert; Helou, George; Sheth, Kartik; Ho, Luis C.; Madore, Barry F.; Menendez-Delmestre, KarIn; Gadotti, Dimitri A.; Knapen, Johan H.; Laurikainen, Eija; Salo, Heikki; Meidt, Sharon E.; Regan, Michael W.; Zaritsky, Dennis; Aravena, Manuel

    2011-08-10

    Spiral arm properties of 46 galaxies in the Spitzer Survey of Stellar Structure in Galaxies (S{sup 4}G) were measured at 3.6 {mu}m, where extinction is small and the old stars dominate. The sample includes flocculent, multiple arm, and grand design types with a wide range of Hubble and bar types. We find that most optically flocculent galaxies are also flocculent in the mid-IR because of star formation uncorrelated with stellar density waves, whereas multiple arm and grand design galaxies have underlying stellar waves. Arm-interarm contrasts increase from flocculent to multiple arm to grand design galaxies and with later Hubble types. Structure can be traced further out in the disk than in previous surveys. Some spirals peak at mid-radius while others continuously rise or fall, depending on Hubble and bar type. We find evidence for regular and symmetric modulations of the arm strength in NGC 4321. Bars tend to be long, high amplitude, and flat-profiled in early-type spirals, with arm contrasts that decrease with radius beyond the end of the bar, and they tend to be short, low amplitude, and exponential-profiled in late Hubble types, with arm contrasts that are constant or increase with radius. Longer bars tend to have larger amplitudes and stronger arms.

  1. A linkage strategy for detection of human quantitative-trait loci. II. Optimization of study designs based on extreme sib pairs and generalized relative risk ratios.

    PubMed Central

    Gu, C; Rao, D C

    1997-01-01

    We are concerned here with practical issues in the application of extreme sib-pair (ESP) methods to quantitative traits. Two important factors-namely, the way extreme trait values are defined and the proportions in which different types of ESPs are pooled, in the analysis-are shown to determine the power and the cost effectiveness of a study design. We found that, in general, combining reasonable numbers of both extremely discordant and extremely concordant sib pairs that were available in the sample is more powerful and more cost effective than pursuing only a single type of ESP. We also found that dividing trait values with a less extreme threshold at one end or at both ends of the trait distribution leads to more cost-effective designs. The notion of generalized relative risk ratios (the lambda methods, as described in the first part of this series of two articles) is used to calculate the power and sample size for various choices of polychotomization of trait values and for the combination of different types of ESPs. A balance then can be struck among these choices, to attain an optimum design. PMID:9246002

  2. Optical Design of the Camera for Transiting Exoplanet Survey Satellite (TESS)

    NASA Technical Reports Server (NTRS)

    Chrisp, Michael; Clark, Kristin; Primeau, Brian; Dalpiaz, Michael; Lennon, Joseph

    2015-01-01

    The optical design of the wide field of view refractive camera, 34 degrees diagonal field, for the TESS payload is described. This fast f/1.4 cryogenic camera, operating at −75 °C, has no vignetting for maximum light gathering within the size and weight constraints. Four of these cameras capture full frames of star images for photometric searches of planet crossings. The optical design evolution, from the initial Petzval design, took advantage of Forbes aspheres to develop a hybrid design form. This maximized the correction from the two aspherics, resulting in a reduction of average spot size by sixty percent in the final design. An external long wavelength pass filter was replaced by an internal filter coating on a lens to save weight, and has been fabricated to meet the specifications. The stray light requirements were met by an extended lens hood baffle design, giving the necessary off-axis attenuation.

  3. The FMOS-COSMOS Survey of Star-forming Galaxies at z~1.6. III. Survey Design, Performance, and Sample Characteristics

    NASA Astrophysics Data System (ADS)

    Silverman, J. D.; Kashino, D.; Sanders, D.; Kartaltepe, J. S.; Arimoto, N.; Renzini, A.; Rodighiero, G.; Daddi, E.; Zahid, J.; Nagao, T.; Kewley, L. J.; Lilly, S. J.; Sugiyama, N.; Baronchelli, I.; Capak, P.; Carollo, C. M.; Chu, J.; Hasinger, G.; Ilbert, O.; Juneau, S.; Kajisawa, M.; Koekemoer, A. M.; Kovac, K.; Le Fèvre, O.; Masters, D.; McCracken, H. J.; Onodera, M.; Schulze, A.; Scoville, N.; Strazzullo, V.; Taniguchi, Y.

    2015-09-01

    We present a spectroscopic survey of galaxies in the COSMOS field using the Fiber Multi-object Spectrograph (FMOS), a near-infrared instrument on the Subaru Telescope. Our survey is specifically designed to detect the Hα emission line that falls within the H-band (1.6-1.8 µm) spectroscopic window from star-forming galaxies with 1.4 < z < 1.7 and Mstellar ≳ 10^10 M⊙. With the high multiplex capability of FMOS, it is now feasible to construct samples of over 1000 galaxies having spectroscopic redshifts at epochs that were previously challenging. The high-resolution mode (R ˜ 2600) effectively separates Hα and [N ii] λ6585, thus enabling studies of the gas-phase metallicity and photoionization state of the interstellar medium. The primary aim of our program is to establish how star formation depends on stellar mass and environment, both recognized as drivers of galaxy evolution at lower redshifts. In addition to the main galaxy sample, our target selection places priority on those detected in the far-infrared by Herschel/PACS to assess the level of obscured star formation and investigate, in detail, outliers from the star formation rate (SFR)-stellar mass relation. Galaxies with Hα detections are followed up with FMOS observations at shorter wavelengths using the J-long (1.11-1.35 µm) grating to detect Hβ and [O iii] λ5008, which provides an assessment of the extinction required to measure SFRs not hampered by dust, and an indication of embedded active galactic nuclei. With 460 redshifts measured from 1153 spectra, we assess the performance of the instrument with respect to achieving our goals, discuss inherent biases in the sample, and detail the emission-line properties. Our higher-level data products, including catalogs and spectra, are available to the community.

  4. PROBABILITY SURVEY DESIGN ALTERNATIVES FOR WATERSHED-BASED STREAM AND RIVER MONITORING PROGRAMS

    EPA Science Inventory

    National, state, and tribal nation monitoring programs are designed to address multiple objectives. One objective comes from Clean Water Act Section 305(b) and is to provide status and trend estimates of the number (or percent) of stream and river lengths that meet designated u...

  5. Some New Bases and Needs for Interior Design from Environmental Research. A Preliminary Survey.

    ERIC Educational Resources Information Center

    Kleeman, Walter, Jr.

    Research which can form new bases for interior design is being greatly accelerated. Investigations in psychology, anthropology, psychiatry, and biology, as well as interdisciplinary projects, turn up literally hundreds of studies, the results of which will vitally affect interior design. This body of research falls into two parts--(1) human…

  6. NATIONAL RESEARCH PROGRAM ON DESIGN-BASED/MODEL-ASSISTED SURVEY METHODOLOGY FOR AQUATIC RESOURCES

    EPA Science Inventory

    We expect to accomplish five major goals with the Program. The first is to extend design-based statistical methodology to cover the unique circumstances encountered in EMAP. The second is to make both existing and newly-developed model-assisted design-based statistical tools m...

  7. Aerodynamic aircraft design methods and their notable applications: Survey of the activity in Japan

    NASA Technical Reports Server (NTRS)

    Fujii, Kozo; Takanashi, Susumu

    1991-01-01

    An overview of aerodynamic aircraft design methods and their recent applications in Japan is presented. A design code which was developed at the National Aerospace Laboratory (NAL) and is in use now is discussed; hence, most of the examples are the result of collaborative work between heavy industry and the National Aerospace Laboratory. A wide variety of applications in transonic to supersonic flow regimes are presented. Although the design of aircraft elements for external flows is the main focus, some internal flow applications are also presented. Recent applications of the design code, using the Navier-Stokes and Euler equations in the analysis mode, include the design of HOPE (a space vehicle) and Upper Surface Blowing (USB) aircraft configurations.

  8. Survey and analysis of research on supersonic drag-due-to-lift minimization with recommendations for wing design

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Mann, Michael J.

    1992-01-01

    A survey of research on drag-due-to-lift minimization at supersonic speeds, including a study of the effectiveness of current design and analysis methods was conducted. The results show that a linearized theory analysis with estimated attainable thrust and vortex force effects can predict with reasonable accuracy the lifting efficiency of flat wings. Significantly better wing performance can be achieved through the use of twist and camber. Although linearized theory methods tend to overestimate the amount of twist and camber required for a given application and provide an overly optimistic performance prediction, these deficiencies can be overcome by implementation of recently developed empirical corrections. Numerous examples of the correlation of experiment and theory are presented to demonstrate the applicability and limitations of linearized theory methods with and without empirical corrections. The use of an Euler code for the estimation of aerodynamic characteristics of a twisted and cambered wing and its application to design by iteration are discussed.

  9. A Survey of Applications and Research in Integrated Design Systems Technology

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The study began with a combination of literature searches, World Wide Web searches, and contacts with individuals and companies who were known to members of our team to have an interest in topics that seemed to be related to our study. There is a long list of such topics, such as concurrent engineering, design for manufacture, life-cycle engineering, systems engineering, systems integration, systems design, design systems, integrated product and process approaches, enterprise integration, integrated product realization, and similar terms. These all capture, at least in part, the flavor of what we describe here as integrated design systems. An inhibiting factor in this inquiry was the absence of agreed terminology for the study of integrated design systems. It is common for the term to be applied to what are essentially augmented Computer-Aided Design (CAD) systems, which are integrated only to the extent that agreements have been reached to attach proprietary extensions to proprietary CAD programs. It is also common for some to use the term integrated design systems to mean a system that applies only, or mainly, to the design phase of a product life cycle. It is likewise common for many of the terms listed earlier to be used as synonyms for integrated design systems. We tried to avoid this ambiguity by adopting the definition of integrated design systems that is implied in the introductory notes that we provided to our contacts, cited earlier. We thus arrived at this definition: Integrated Design Systems refers to the integration of the different tools and processes that comprise the engineering of complex systems. It takes a broad view of the engineering of systems, to include consideration of the entire product realization process and the product life cycle. 
An important aspect of integrated design systems is the extent to which they integrate existing "islands of automation" into a comprehensive design and product realization environment. As the study progressed, we relied increasingly upon a networking approach to lead us to new information. The departure point for such searches often was a government-sponsored project or a company initiative. The advantage of this approach was that short conversations with knowledgeable persons would usually cut through confusion over differences of terminology, thereby somewhat reducing the search space of the study. Even so, it was not until late in our eight-month inquiry that we began to see signs of convergence of the search, in the sense that a number of the latest inquiries began to turn up references to earlier contacts. As suggested above, this convergence often occurred with respect to particular government or company projects.

  10. Advanced power generation systems for the 21st Century: Market survey and recommendations for a design philosophy

    SciTech Connect

    Andriulli, J.B.; Gates, A.E.; Haynes, H.D.; Klett, L.B.; Matthews, S.N.; Nawrocki, E.A.; Otaduy, P.J.; Scudiere, M.B.; Theiss, T.J.; Thomas, J.F.; Tolbert, L.M.; Yauss, M.L.; Voltz, C.A.

    1999-11-01

    The purpose of this report is to document the results of a study designed to enhance the performance of future military generator sets (gen-sets) in the medium power range. The study includes a market survey of the state of the art in several key component areas and recommendations comprising a design philosophy for future military gen-sets. The market survey revealed that the commercial market is in a state of flux, but it is currently or will soon be capable of providing the technologies recommended here in a cost-effective manner. The recommendations, if implemented, should result in future power generation systems that are much more functional than today's gen-sets. The number of differing units necessary (both family sizes and frequency modes) to cover the medium power range would be decreased significantly, while the weight and volume of each unit would decrease, improving the transportability of the power source. Improved fuel economy and overall performance would result from more effective utilization of the prime mover in the generator. The units would allow for more flexibility and control, improved reliability, and more effective power management in the field.

  11. Long-term validity and reliability of a patient survey instrument designed for general dental practice.

    PubMed

    Busby, M; Matthews, R; Burke, F J T; Mullins, A; Schumaker, K

    2015-10-01

    Aim To consider the extent to which the validity and reliability of the Denplan Excel Patient Survey (DEPS) has been confirmed during its development and by its use in general dental practice and to explore methods by which any survey instrument used in general dental practice might be validated and tested for reliability. Methods DEPS seeks to measure perceived practice performance on those issues considered to be of greatest importance to patients. Content validity was developed by a literature review and tested in a pilot study. Criterion validity was tested by comparing patient retention in a payment plan for practices achieving the highest DEPS scores with those attaining the lowest scores over a two year period (surveys completed between 2010 and 2012). Reliability was assessed using the test/re-test method for 23 practices with approximately a three year time interval between tests. Internal consistency was tested by comparing Net Promoter Scores (NPS - which is measured in DEPS) attained by practices with their Patient Perception Index (PPI) as measured by the ten core questions in DEPS. Results Practices in the pilot study strongly endorsed the content validity of DEPS. The 12 practices with the highest scores in the DEPS slightly increased their number of patients registered in Denplan payment plans during a two year period. The 12 lowest scoring practices saw 7% of their patients de-register during the same period. The 23 practices selected for the test/re-test study averaged more than 250 responses for both the test and re-test phases. The magnitude and pattern of their results were similar in both phases, while, on average, a modest improvement in results was observed. Internal consistency was confirmed as NPS results in DEPS closely mapped PPI results. 
The higher the measurement of perceived quality (PPI), the more likely patients were to recommend the practice (NPS). Conclusion Both through its development and use over the last four years, the DEPS has demonstrated good validity and reliability. The authors conclude that this level of validity and reliability is adequate for the clinical/general care audit purpose of DEPS and that it is therefore likely to reliably inform practices where further development is indicated. It is important and quite straightforward to both validate and check the reliability of patient surveys used in general dental practice so that dental teams can be confident in the instrument they are using. PMID:26450250
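
    The NPS-versus-PPI consistency check described above can be illustrated with a small simulation. The standard NPS definition (% promoters rating 9-10 minus % detractors rating 0-6 on a 0-10 scale) is assumed; the DEPS items, scoring, and practice data are not reproduced here, and the numbers below are fabricated:

```python
import numpy as np

def net_promoter_score(ratings):
    """NPS on the standard 0-10 scale: % promoters (9-10) minus % detractors (0-6)."""
    ratings = np.asarray(ratings)
    promoters = np.mean(ratings >= 9)
    detractors = np.mean(ratings <= 6)
    return 100.0 * (promoters - detractors)

rng = np.random.default_rng(1)
# Simulate 23 practices; practices with higher underlying quality get both
# higher core-question scores (a PPI proxy) and higher recommend ratings.
quality = rng.uniform(0, 1, 23)
ppi = 60 + 35 * quality + rng.normal(0, 2, 23)
nps = [net_promoter_score(np.clip(rng.normal(5 + 5 * q, 2, 250), 0, 10).round())
       for q in quality]

# Internal consistency: NPS should track the perception index.
r = np.corrcoef(ppi, nps)[0, 1]
print(f"Correlation between PPI proxy and NPS: {r:.2f}")
```

    A strong positive correlation across practices is the simulated analogue of the "NPS closely mapped PPI" finding.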

  12. A Study of Program Management Procedures in the Campus-Based and Basic Grant Programs. Technical Report No. 1: Sample Design, Student Survey Yield and Bias.

    ERIC Educational Resources Information Center

    Puma, Michael J.; Ellis, Richard

    Part of a study of program management procedures in the campus-based and Basic Educational Opportunity Grant programs reports on the design of the site visit component of the study and the results of the student survey, both in terms of the yield obtained and the quality of the data. Chapter 2 describes the design of sampling methodology employed…

  13. Surveying the Commons: Current Implementation of Information Commons Web sites

    ERIC Educational Resources Information Center

    Leeder, Christopher

    2009-01-01

    This study assessed the content of 72 academic library Information Commons (IC) Web sites using content analysis, quantitative assessment and qualitative surveys of site administrators to analyze current implementation by the academic library community. Results show that IC Web sites vary widely in content, design and functionality, with few…

  14. The Mississippi Delta Cardiovascular Health Examination Survey: Study Design and Methods

    PubMed Central

    Short, Vanessa L.; Ivory-Walls, Tameka; Smith, Larry; Loustalot, Fleetwood

    2015-01-01

    Assessment of cardiovascular disease (CVD) morbidity and mortality in subnational areas is limited. A model for regional CVD surveillance is needed, particularly among vulnerable populations underrepresented in current monitoring systems. The Mississippi Delta Cardiovascular Health Examination Survey (CHES) is a population-based, cross-sectional study on a representative sample of adults living in the 18-county Mississippi Delta region, a rural, impoverished area with high rates of poor health outcomes and marked health disparities. The primary objectives of Delta CHES are to (1) determine the prevalence and distribution of CVD and CVD risk factors using self-reported and directly measured health metrics and (2) assess environmental perceptions and existing policies that support or deter healthy choices. An address-based sampling frame is used for household enumeration and participant recruitment, and an in-home data collection model is used to collect survey data, anthropometric measures, and blood samples from participants. Data from all sources will be merged into one analytic dataset and sample weights developed to ensure data are representative of the Mississippi Delta region adult population. Information gathered will be used to assess the burden of CVD and guide the development, implementation, and evaluation of cardiovascular health promotion and risk factor control strategies. PMID:25844257

  15. Ultradeep IRAC Imaging Over the HUDF and GOODS-South: Survey Design and Imaging Data Release

    NASA Astrophysics Data System (ADS)

    Labbé, I.; Oesch, P. A.; Illingworth, G. D.; van Dokkum, P. G.; Bouwens, R. J.; Franx, M.; Carollo, C. M.; Trenti, M.; Holden, B.; Smit, R.; González, V.; Magee, D.; Stiavelli, M.; Stefanon, M.

    2015-12-01

    The IRAC ultradeep field and IRAC Legacy over GOODS programs are two ultradeep imaging surveys at 3.6 and 4.5 µm with the Spitzer Infrared Array Camera (IRAC). The primary aim is to directly detect the infrared light of reionization epoch galaxies at z > 7 and to constrain their stellar populations. The observations cover the Hubble Ultra Deep Field (HUDF), including the two HUDF parallel fields, and the CANDELS/GOODS-South, and are combined with archival data from all previous deep programs into one ultradeep data set. The resulting imaging reaches unprecedented coverage in IRAC 3.6 and 4.5 µm ranging from >50 hr over 150 arcmin², >100 hr over 60 arcmin², to ~200 hr over 5-10 arcmin². This paper presents the survey description, data reduction, and public release of reduced mosaics on the same astrometric system as the CANDELS/GOODS-South Wide Field Camera 3 (WFC3) data. To facilitate prior-based WFC3+IRAC photometry, we introduce a new method to create high signal-to-noise PSFs from the IRAC data and reconstruct the complex spatial variation due to survey geometry. The PSF maps are included in the release, as are registered maps of subsets of the data to enable reliability and variability studies. Simulations show that the noise in the ultradeep IRAC images decreases approximately as the square root of integration time over the range 20-200 hr, well below the classical confusion limit, reaching 1σ point-source sensitivities as faint as 15 nJy (28.5 AB) at 3.6 µm and 18 nJy (28.3 AB) at 4.5 µm. The value of such ultradeep IRAC data is illustrated by direct detections of z = 7-8 galaxies as faint as HAB = 28. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained from the data archive at the Space Telescope Science Institute. STScI is operated by the Association of Universities for Research in Astronomy, Inc. under NASA contract NAS 5-26555. 
Based on observations made with the Spitzer Space Telescope, which is operated by the Jet Propulsion Laboratory, California Institute of Technology under a contract with NASA. Support for this work was provided by NASA through an award issued by JPL/Caltech.

  16. Fitting statistical distributions to sea duck count data: implications for survey design and abundance estimation

    USGS Publications Warehouse

    Zipkin, Elise F.; Leirness, Jeffery B.; Kinlan, Brian P.; O'Connell, Allan F.; Silverman, Emily D.

    2014-01-01

    Determining appropriate statistical distributions for modeling animal count data is important for accurate estimation of abundance, distribution, and trends. In the case of sea ducks along the U.S. Atlantic coast, managers want to estimate local and regional abundance to detect and track population declines, to define areas of high and low use, and to predict the impact of future habitat change on populations. In this paper, we used a modified marked point process to model survey data that recorded flock sizes of Common eiders, Long-tailed ducks, and Black, Surf, and White-winged scoters. The data come from an experimental aerial survey, conducted by the United States Fish & Wildlife Service (USFWS) Division of Migratory Bird Management, during which east-west transects were flown along the Atlantic Coast from Maine to Florida during the winters of 2009–2011. To model the number of flocks per transect (the points), we compared the fit of four statistical distributions (zero-inflated Poisson, zero-inflated geometric, zero-inflated negative binomial and negative binomial) to data on the number of species-specific sea duck flocks that were recorded for each transect flown. To model the flock sizes (the marks), we compared the fit of flock size data for each species to seven statistical distributions: positive Poisson, positive negative binomial, positive geometric, logarithmic, discretized lognormal, zeta and Yule–Simon. Akaike’s Information Criterion and Vuong’s closeness tests indicated that the negative binomial and discretized lognormal were the best distributions for all species for the points and marks, respectively. 
These findings have important implications for estimating sea duck abundances as the discretized lognormal is a more skewed distribution than the Poisson and negative binomial, which are frequently used to model avian counts; the lognormal is also less heavy-tailed than the power law distributions (e.g., zeta and Yule–Simon), which are becoming increasingly popular for group size modeling. Choosing appropriate statistical distributions for modeling flock size data is fundamental to accurately estimating population summaries, determining required survey effort, and assessing and propagating uncertainty through decision-making processes.
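
    The distribution-comparison step described above can be sketched as follows: fit candidate count distributions and compare them by AIC. The data below are simulated overdispersed counts, not the USFWS survey data, and the fits use simple closed-form and method-of-moments estimates rather than the paper's full marked-point-process model:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Simulated overdispersed "flock counts": negative binomial draws,
# so the variance (~40) greatly exceeds the mean (~8).
counts = rng.negative_binomial(2, 0.2, size=500)

# Poisson fit: the MLE of the rate is the sample mean (1 parameter).
lam = counts.mean()
aic_pois = 2 * 1 - 2 * stats.poisson.logpmf(counts, lam).sum()

# Negative binomial fit: crude method-of-moments for (n, p) (2 parameters).
mean, var = counts.mean(), counts.var()
p = mean / var
n = mean * p / (1 - p)
aic_nb = 2 * 2 - 2 * stats.nbinom.logpmf(counts, n, p).sum()

print(f"AIC Poisson: {aic_pois:.1f}, AIC negative binomial: {aic_nb:.1f}")
```

    For overdispersed counts the negative binomial's extra dispersion parameter pays for itself, so its AIC comes out lower, mirroring the paper's finding that the Poisson is a poor model for flock counts.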

  17. DESIGN AND INDICATOR CONSIDERATIONS FOR A PROBABILISTIC SURVEY OF USA GREAT RIVERS: MISSOURI, MISSISSIPPI, OHIO

    EPA Science Inventory

    Great River Ecosystems (GRE) include the river channel and associated backwaters and floodplain habitats. The challenge in designing a GRE monitoring and assessment program is to choose a set of habitats, indicators, and sampling locations that reveal the ecological condition of ...

  18. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    ERIC Educational Resources Information Center

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  19. On Quantitizing

    ERIC Educational Resources Information Center

    Sandelowski, Margarete; Voils, Corrine I.; Knafl, George

    2009-01-01

    "Quantitizing", commonly understood to refer to the numerical translation, transformation, or conversion of qualitative data, has become a staple of mixed methods research. Typically glossed are the foundational assumptions, judgments, and compromises involved in converting disparate data sets into each other and whether such conversions advance…

  20. The Norwegian Offender Mental Health and Addiction Study – Design and Implementation of a National Survey and Prospective Cohort Study

    PubMed Central

    Bukten, Anne; Lund, Ingunn Olea; Rognli, Eline Borger; Stavseth, Marianne Riksheim; Lobmaier, Philipp; Skurtveit, Svetlana; Clausen, Thomas; Kunøe, Nikolaj

    2015-01-01

    Norwegian prison inmates are burdened by problems before they enter prison. Few studies have managed to assess this burden and relate it to what occurs for the inmates once they leave prison. The Norwegian Offender Mental Health and Addiction (NorMA) study is a large-scale longitudinal cohort study that combines national survey and registry data in order to understand mental health, substance use, and criminal activity before, during, and after custody among prisoners in Norway. The main goal of the study is to describe the criminal and health-related trajectories based on both survey and registry linkage information. Data were collected from 1,499 inmates in Norwegian prison facilities during 2013-2014. Of these, 741 inmates provided a valid personal identification number and constitute a cohort that will be examined retrospectively and prospectively, along with data from nationwide Norwegian registries. This study describes the design, procedures, and implementation of the ongoing NorMA study and provides an outline of the initial data. PMID:26648732

  1. The C-Band All-Sky Survey (C-BASS): design and implementation of the northern receiver

    NASA Astrophysics Data System (ADS)

    King, O. G.; Jones, Michael E.; Blackhurst, E. J.; Copley, C.; Davis, R. J.; Dickinson, C.; Holler, C. M.; Irfan, M. O.; John, J. J.; Leahy, J. P.; Leech, J.; Muchovej, S. J. C.; Pearson, T. J.; Stevenson, M. A.; Taylor, Angela C.

    2014-03-01

    The C-Band All-Sky Survey is a project to map the full sky in total intensity and linear polarization at 5 GHz. The northern component of the survey uses a broad-band single-frequency analogue receiver fitted to a 6.1-m telescope at the Owens Valley Radio Observatory in California, USA. The receiver architecture combines a continuous-comparison radiometer and a correlation polarimeter in a single receiver for stable simultaneous measurement of both total intensity and linear polarization, using custom-designed analogue receiver components. The continuous-comparison radiometer measures the temperature difference between the sky and temperature-stabilized cold electrical reference loads. A cryogenic front-end is used to minimize receiver noise, with a system temperature of ≈30 K in both linear polarization and total intensity. Custom cryogenic notch filters are used to counteract man-made radio frequency interference. The radiometer 1/f noise is dominated by atmospheric fluctuations, while the polarimeter achieves a 1/f noise knee frequency of 10 mHz, similar to the telescope azimuthal scan frequency.
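
    The 1/f knee figure of merit quoted above can be illustrated by estimating the knee of a simulated noise stream: the knee is the frequency at which the 1/f component's power equals the white-noise floor, so the total PSD there is twice the floor. The noise model, sample rate, and 10 mHz knee below are assumptions of this toy example, not C-BASS measurements:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
fs, n = 10.0, 2 ** 20          # sample rate (Hz) and stream length
f_knee_true = 0.01             # 10 mHz knee, as quoted in the abstract

# Shape white noise in the frequency domain so the PSD is
# proportional to 1 + f_knee/f (white floor plus a 1/f component).
freqs = np.fft.rfftfreq(n, d=1 / fs)
filt = np.ones_like(freqs)
filt[1:] = np.sqrt(1 + f_knee_true / freqs[1:])
noise = np.fft.irfft(np.fft.rfft(rng.standard_normal(n)) * filt)

# Estimate the PSD and find where it reaches twice the white floor.
f, psd = signal.welch(noise, fs=fs, nperseg=2 ** 15)
floor = np.median(psd[f > 1.0])               # high-f bins give the floor
low = (f > 0) & (f < 0.1)                     # search near the expected knee
knee_est = f[low][np.argmin(np.abs(psd[low] / floor - 2.0))]
print(f"estimated knee: {knee_est * 1e3:.1f} mHz")
```

    With enough averaging, the estimate lands close to the injected 10 mHz; in practice the knee is usually read off a fitted PSD model rather than a single crossing bin.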

  2. A survey on the design of multiprocessing systems for artificial intelligence applications

    NASA Technical Reports Server (NTRS)

    Wah, Benjamin W.; Li, Guo Jie

    1989-01-01

    Some issues in designing computers for artificial intelligence (AI) processing are discussed. These issues are divided into three levels: the representation level, the control level, and the processor level. The representation level deals with the knowledge and methods used to solve the problem and the means to represent it. The control level is concerned with the detection of dependencies and parallelism in the algorithmic and program representations of the problem, and with the synchronization and scheduling of concurrent tasks. The processor level addresses the hardware and architectural components needed to evaluate the algorithmic and program representations. Solutions for the problems of each level are illustrated by a number of representative systems. Design decisions in existing projects on AI computers are classified into top-down, bottom-up, and middle-out approaches.

  3. Design and Calibration of a Flowfield Survey Rake for Inlet Flight Research

    NASA Technical Reports Server (NTRS)

    Flynn, Darin C.; Ratnayake, Nalin A.; Frederick, Michael

    2009-01-01

    A flowfield rake was designed to quantify the flowfield for inlet research underneath NASA DFRC's F-15B airplane. Detailed loads and stress analyses were performed using CFD and empirical methods to assure structural integrity. Calibration data were generated through wind-tunnel testing of the rake, and a calibration algorithm was developed to determine the local Mach number and flow angularity at each probe. RAGE was flown in November 2008, and the data are currently being analyzed.

  4. Design of a Mars Airplane Propulsion System for the Aerial Regional-Scale Environmental Survey (ARES) Mission Concept

    NASA Technical Reports Server (NTRS)

    Kuhl, Christopher A.

    2009-01-01

    The Aerial Regional-Scale Environmental Survey (ARES) is a Mars exploration mission concept with the goal of taking scientific measurements of the atmosphere, surface, and subsurface of Mars by using an airplane as the payload platform. The ARES team first conducted a Phase A study for a 2007 launch opportunity, which was completed in May 2003. Following this study, significant efforts were undertaken to reduce the risk of the atmospheric flight system, under the NASA Langley Planetary Airplane Risk Reduction Project. The concept was then proposed to the Mars Scout program in 2006 for a 2011 launch opportunity. This paper summarizes the design and development of the ARES airplane propulsion subsystem, beginning with the inception of the ARES project in 2002 and continuing through the submittal of the Mars Scout proposal in July 2006.

  5. Nonmedical influences on medical decision making: an experimental technique using videotapes, factorial design, and survey sampling.

    PubMed Central

    Feldman, H A; McKinlay, J B; Potter, D A; Freund, K M; Burns, R B; Moskowitz, M A; Kasten, L E

    1997-01-01

    OBJECTIVE: To study nonmedical influences on the doctor-patient interaction. A technique using simulated patients and "real" doctors is described. DATA SOURCES: A random sample of physicians, stratified on such characteristics as demographics, specialty, or experience, and selected from commercial and professional listings. STUDY DESIGN: A medical appointment is depicted on videotape by professional actors. The patient's presenting complaint (e.g., chest pain) allows a range of valid interpretation. Several alternative versions are taped, featuring the same script with patient-actors of different age, sex, race, or other characteristics. Fractional factorial design is used to select a balanced subset of patient characteristics, reducing costs without biasing the outcome. DATA COLLECTION: Each physician is shown one version of the videotape appointment and is asked to describe how he or she would diagnose or treat such a patient. PRINCIPAL FINDINGS: Two studies using this technique have been completed to date, one involving chest pain and dyspnea and the other involving breast cancer. The factorial design provided sufficient power, despite limited sample size, to demonstrate with statistical significance various influences of the experimental and stratification variables, including the patient's gender and age and the physician's experience. Persistent recruitment produced a high response rate, minimizing selection bias and enhancing validity. CONCLUSION: These techniques permit us to determine, with a degree of control unattainable in observational studies, whether medical decisions, as described by actual physicians drawn from a demographic or professional group of interest, are influenced by a prescribed set of nonmedical factors. PMID:9240285
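As an illustrative aside (not part of the study above), the balanced-subset idea behind a fractional factorial design can be sketched in a few lines. The attribute names are hypothetical; a 2^(4-1) half-fraction keeps 8 of the 16 possible patient-characteristic combinations while leaving every attribute balanced:

```python
from itertools import product

# Hypothetical 2^(4-1) fractional factorial: four binary patient
# attributes, coded -1/+1. Keep the half-fraction satisfying the
# defining relation I = ABCD (product of coded levels equals +1),
# giving 8 balanced videotape versions instead of all 16.
attrs = ["age", "sex", "race", "insurance"]  # assumed attribute names

full = list(product([-1, +1], repeat=len(attrs)))                   # 16 runs
half = [run for run in full if run[0]*run[1]*run[2]*run[3] == +1]   # 8 runs

print(len(half))  # 8
# Balance check: each attribute level appears in exactly half the runs.
assert all(sum(1 for r in half if r[i] == +1) == len(half) // 2
           for i in range(len(attrs)))
```

The defining relation confounds only the highest-order interaction with the mean, which is why such a fraction can halve cost without biasing main-effect estimates.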

  6. News Note: Quantitative Imaging Program

    Cancer.gov

    NCI is launching a new program to qualify existing NCI designated Cancer Centers with an added attribute -- as Centers of Quantitative Imaging Excellence. This program will significantly decrease potential variability in image procedures done while a pati

  7. Quantitative analysis of masculinity perceptions 

    E-print Network

    Cima, Brian Norman

    2002-01-01

    This study was conducted using a modified version of the Brannon's Masculinity index, designed to measure quantitatively the ideologies of American masculinity. Included is an extensive literature review and an explanation ...

  8. Design and Calibration of a Flowfield Survey Rake for Inlet Flight Research

    NASA Technical Reports Server (NTRS)

    Flynn, Darin C.; Ratnayake, Nalin A.; Frederick, Michael

    2009-01-01

    The Propulsion Flight Test Fixture at the NASA Dryden Flight Research Center is a unique test platform available for use on NASA's F-15B aircraft, tail number 836, as a modular host for a variety of aerodynamics and propulsion research. For future flight data from this platform to be valid, more information must be gathered concerning the quality of the airflow underneath the body of the F-15B at various flight conditions, especially supersonic conditions. The flow angularity and Mach number must be known at multiple locations on any test article interface plane for measurement data at these locations to be valid. To determine this prerequisite information, flight data will be gathered in the Rake Airflow Gauge Experiment using a custom-designed flowfield rake to probe the airflow underneath the F-15B at the desired flight conditions. This paper addresses the design considerations of the rake and probe assembly, including the loads and stress analysis using analytical methods, computational fluid dynamics, and finite element analysis. It also details the flow calibration procedure, including the completed wind-tunnel test and posttest data reduction, calibration verification, and preparation for flight-testing.

  9. Designing for Dissemination Among Public Health Researchers: Findings From a National Survey in the United States

    PubMed Central

    Jacobs, Julie A.; Tabak, Rachel G.; Hoehner, Christine M.; Stamatakis, Katherine A.

    2013-01-01

    Objectives. We have described the practice of designing for dissemination among researchers in the United States with the intent of identifying gaps and areas for improvement. Methods. In 2012, we conducted a cross-sectional study of 266 researchers using a search of the top 12 public health journals in PubMed and lists available from government-sponsored research. The sample involved scientists at universities, the National Institutes of Health, and the Centers for Disease Control and Prevention in the United States. Results. In the pooled sample, 73% of respondents estimated they spent less than 10% of their time on dissemination. About half of respondents (53%) had a person or team in their unit dedicated to dissemination. Seventeen percent of all respondents used a framework or theory to plan their dissemination activities. One third of respondents (34%) always or usually involved stakeholders in the research process. Conclusions. The current data and the existing literature suggest considerable room for improvement in designing for dissemination. PMID:23865659

  10. Information Presentation Features and Comprehensibility of Hospital Report Cards: Design Analysis and Online Survey Among Users

    PubMed Central

    2015-01-01

    Background Improving the transparency of information about the quality of health care providers is one way to improve health care quality. It is assumed that Internet information steers patients toward better-performing health care providers and will motivate providers to improve quality. However, the effect of public reporting on hospital quality is still small. One of the reasons is that users find it difficult to understand the formats in which information is presented. Objective We analyzed the presentation of risk-adjusted mortality rate (RAMR) for coronary angiography in the 10 most commonly used German public report cards to analyze the impact of information presentation features on their comprehensibility. We wanted to determine which information presentation features were utilized, were preferred by users, led to better comprehension, and had similar effects to those reported in evidence-based recommendations described in the literature. Methods The study consisted of 5 steps: (1) identification of best-practice evidence about the presentation of information on hospital report cards; (2) selection of a single risk-adjusted quality indicator; (3) selection of a sample of designs adopted by German public report cards; (4) identification of the information presentation elements used in public reporting initiatives in Germany; and (5) administration of an online questionnaire to an online panel to determine whether respondents were able to identify the hospital with the lowest RAMR and whether respondents’ hospital choices were associated with particular information design elements. 
Results Evidence-based recommendations were made relating to the following information presentation features relevant to report cards: evaluative tables with symbols, tables without symbols, bar charts, bar charts without symbols, bar charts with symbols, symbols, evaluative word labels, highlighting, order of providers, high values to indicate good performance, explicit statements of whether high or low values indicate good performance, and incomplete data (“N/A” as a value). When investigating the RAMR in a sample of 10 hospitals’ report cards, 7 of these information presentation features were identified. Of these, 5 information presentation features improved comprehensibility in a manner reported previously in the literature. Conclusions To our knowledge, this is the first study to systematically analyze the most commonly used public report card designs used in Germany. Best-practice evidence identified in the international literature was in agreement with 5 findings about German report card designs: (1) avoid tables without symbols, (2) include bar charts with symbols, (3) state explicitly whether high or low values indicate good performance or provide a “good quality” range, (4) avoid incomplete data (N/A given as a value), and (5) rank hospitals by performance. However, these findings are preliminary and should be the subject of further evaluation. The implementation of 4 of these recommendations should not present insurmountable obstacles. However, ranking hospitals by performance may present substantial difficulties. PMID:25782186

  11. quantitative ecology & resource management QERM An Interdisciplinary Graduate Program

    E-print Network

    Ferguson, Thomas S.

    Quantitative Ecology & Resource Management (QERM): An Interdisciplinary Graduate Program, UNIVERSITY OF WASHINGTON. What is QERM? Quantitative Ecology & Resource Management (QERM) is a unique interdisciplinary graduate program designed for students interested in applying quantitative tools to ecological and resource

  12. Flow field survey near the rotational plane of an advanced design propeller on a JetStar airplane

    NASA Technical Reports Server (NTRS)

    Walsh, K. R.

    1985-01-01

    An investigation was conducted to obtain upper fuselage surface static pressures and boundary layer velocity profiles below the centerline of an advanced design propeller. This investigation documents the upper fuselage velocity flow field in support of the in-flight acoustic tests conducted on a JetStar airplane. Initial results of the boundary layer survey show evidence of an unusual flow disturbance, which is attributed to the two windshield wiper assemblies on the aircraft. The assemblies were removed, eliminating the disturbances from the flow field. This report presents boundary layer velocity profiles at altitudes of 6096 and 9144 m (20,000 and 30,000 ft) and Mach numbers from 0.6 to 0.8, and investigates the effects of the windshield wiper assemblies on these profiles. Because of the unconventional velocity profiles that were obtained with the assemblies mounted, classical boundary layer parameters, such as momentum and displacement thicknesses, are not presented. The effects of flight test variables (Mach number and angles of attack and sideslip) and an advanced design propeller on boundary layer profiles - with the wiper assemblies mounted and removed - are presented.

  13. Expert systems as design aids for artificial vision systems: a survey

    NASA Astrophysics Data System (ADS)

    Crevier, Daniel

    1993-08-01

    The development of software that would be to computer vision what expert system shells are to expert systems has been the subject of considerable inquiry over the last ten years; this paper reviews the pertinent publications and tries to present a coherent view of the field. We start by outlining two major differences between would-be 'vision shells' and conventional expert system shells. The first is the need for an intermediate level of symbolic representation between image pixels and the knowledge base. The second is that the mental operations that people perform to interpret images lie almost totally below the threshold of consciousness. Vision system designers therefore cannot, as domain experts normally do, examine their own mental processes and cast them into rules to extract information from images. The vision shell should thus contain, in addition to the usual knowledge engineering toolbox, knowledge on the pertinence of specific imaging operations towards various goals. After a review of the role of explicit knowledge in artificial vision, we examine the architecture a vision shell should have, and look at ways of facilitating the entry of domain-pertinent knowledge into it. Final remarks are made on knowledge representation and acquisition aspects particular to industrial applications.

  14. 78 FR 5459 - Medicare Program; Request for Information To Aid in the Design and Development of a Survey...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ..., endoscopies) performed in HOSDs and ASCs. These surveys, survey questions, and measures should measure and... older) who recently have had surgery or other procedures, such as a colonoscopy or endoscopy, in...

  15. 50 CFR 600.1417 - Requirements for exempted state designation based on submission of recreational survey data.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...participation in a regional survey of marine and anadromous recreational...regional survey must: (1) Include all of the states within each region as follows...Atlantic coast); (ii) Florida (Gulf of Mexico coast), Alabama, Mississippi,...

  16. THE VIRUS-P EXPLORATION OF NEARBY GALAXIES (VENGA): SURVEY DESIGN, DATA PROCESSING, AND SPECTRAL ANALYSIS METHODS

    SciTech Connect

    Blanc, Guillermo A.; Weinzirl, Tim; Song, Mimi; Heiderman, Amanda; Gebhardt, Karl; Jogee, Shardha; Evans, Neal J. II; Kaplan, Kyle; Marinova, Irina; Vutisalchavakul, Nalin; Van den Bosch, Remco C. E.; Luo Rongxin; Hao Lei; Drory, Niv; Fabricius, Maximilian; Fisher, David; Yoachim, Peter

    2013-05-15

    We present the survey design, data reduction, and spectral fitting pipeline for the VIRUS-P Exploration of Nearby Galaxies (VENGA). VENGA is an integral field spectroscopic survey, which maps the disks of 30 nearby spiral galaxies. Targets span a wide range in Hubble type, star formation activity, morphology, and inclination. The VENGA data cubes have 5.6 arcsec FWHM spatial resolution, ≈5 Å FWHM spectral resolution, sample the 3600–6800 Å range, and cover large areas, typically sampling galaxies out to ≈0.7 R25. These data cubes can be used to produce two-dimensional maps of the star formation rate, dust extinction, electron density, stellar population parameters, the kinematics and chemical abundances of both stars and ionized gas, and other physical quantities derived from the fitting of the stellar spectrum and the measurement of nebular emission lines. To exemplify our methods and the quality of the data, we present the VENGA data cube on the face-on Sc galaxy NGC 628 (a.k.a. M 74). The VENGA observations of NGC 628 are described, as well as the construction of the data cube, our spectral fitting method, and the fitting of the stellar and ionized gas velocity fields. We also propose a new method to measure the inclination of nearly face-on systems based on the matching of the stellar and gas rotation curves using asymmetric drift corrections. VENGA will measure relevant physical parameters across different environments within these galaxies, allowing a series of studies on star formation, structure assembly, stellar populations, chemical evolution, galactic feedback, nuclear activity, and the properties of the interstellar medium in massive disk galaxies.

  17. National Aquatic Resource Surveys: Use of Geospatial data in their design and spatial prediction at non-monitored locations

    EPA Science Inventory

    The National Aquatic Resource Surveys (NARS) are four surveys conducted by the U.S. Environmental Protection Agency working in collaboration with states, tribal nations and other federal agencies. The surveys are conducted for lakes and reservoirs, streams, estuaries and intracoa...

  18. Application of Screening Experimental Designs to Assess Chromatographic Isotope Effect upon Isotope-Coded Derivatization for Quantitative Liquid Chromatography–Mass Spectrometry

    PubMed Central

    2015-01-01

    Isotope effect may cause partial chromatographic separation of labeled (heavy) and unlabeled (light) isotopologue pairs. Together with a simultaneous matrix effect, this could lead to unacceptable accuracy in quantitative liquid chromatography–mass spectrometry assays, especially when electrospray ionization is used. Four biologically relevant reactive aldehydes (acrolein, malondialdehyde, 4-hydroxy-2-nonenal, and 4-oxo-2-nonenal) were derivatized with light or heavy (d3-, 13C6-, 15N2-, or 15N4-labeled) 2,4-dinitrophenylhydrazine and used as model compounds to evaluate chromatographic isotope effects. For comprehensive assessment of retention time differences between light/heavy pairs under various gradient reversed-phase liquid chromatography conditions, major chromatographic parameters (stationary phase, mobile phase pH, temperature, organic solvent, and gradient slope) and different isotope labelings were addressed by multiple-factor screening using experimental designs that included both asymmetrical (Addelman) and Plackett–Burman schemes followed by statistical evaluations. Results confirmed that the most effective approach to avoid chromatographic isotope effect is the use of 15N or 13C labeling instead of deuterium labeling, while chromatographic parameters had no general influence. Comparison of the alternate isotope-coded derivatization assay (AIDA) using deuterium versus 15N labeling gave unacceptable differences (>15%) upon quantifying some of the model aldehydes from biological matrixes. On the basis of our results, we recommend the modification of the AIDA protocol by replacing d3-2,4-dinitrophenylhydrazine with 15N- or 13C-labeled derivatizing reagent to avoid possible unfavorable consequences of chromatographic isotope effects. PMID:24922593
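As an illustrative aside (not the authors' actual design matrix), a two-level screening design of the Plackett–Burman family can be constructed from a Hadamard matrix; for 8 runs, dropping the all-ones column of a Sylvester Hadamard matrix yields 7 mutually orthogonal two-level factor columns:

```python
import numpy as np

# Sketch: 8-run two-level screening design from a Sylvester Hadamard matrix.
H2 = np.array([[1, 1],
               [1, -1]])
H8 = np.kron(np.kron(H2, H2), H2)   # 8x8 Hadamard matrix, entries +/-1
design = H8[:, 1:]                  # drop the all-ones column -> 7 factors

# Orthogonality check: distinct factor columns are uncorrelated,
# so main effects can be estimated independently.
assert design.shape == (8, 7)
assert np.allclose(design.T @ design, 8 * np.eye(7))
```

Orthogonal columns are what let a screening design estimate many main effects (here, chromatographic parameters and labelings) from very few runs.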

  19. Design strategies for electromagnetic geophysical surveys (Inverse Problems 16 (2000) 1097–1117)

    E-print Network

    2000-01-01

    Hansruedi Maurer, David E Boerner and Andrew Curtis. The geophysical literature contains numerous advanced inversion methods (see, e.g., ...)

  20. Bovine tuberculosis infection in wild mammals in the South-West region of England: a survey of prevalence and a semi-quantitative assessment of the relative risks to cattle.

    PubMed

    Delahay, R J; Smith, G C; Barlow, A M; Walker, N; Harris, A; Clifton-Hadley, R S; Cheeseman, C L

    2007-03-01

    In the United Kingdom, badgers are implicated in the transmission of Mycobacterium bovis to cattle, but little information is available on the potential role of other wild mammals. This paper presents the results of the largest systematic UK survey of M. bovis infection in other wild mammals. Mammal carcasses (4715) from throughout the South-West region of England were subjected to a systematic post mortem examination, microbiological culture of tissues and spoligotyping of isolates. Infection was confirmed in fox, stoat, polecat, common shrew, yellow-necked mouse, wood mouse, field vole, grey squirrel, roe deer, red deer, fallow deer and muntjac. Prevalence in deer may have been underestimated because the majority were incomplete carcasses, which reduced the likelihood of detecting infection. Infected cases were found in Wiltshire, Somerset, Devon and Cornwall, Gloucestershire and Herefordshire. Lesions were found in a high proportion of spoligotype-positive fallow, red and roe deer, and a single fox, stoat and muntjac. M. bovis spoligotypes occurred at frequencies similar to those in cattle and badgers. Data on the prevalence, pathology, abundance and ecology of wild mammals were integrated in a semi-quantitative risk assessment of the likelihood of transmission to cattle relative to badgers. Although most species presented a relatively low risk, the higher values and uncertainty associated with muntjac, roe, red and in particular fallow deer suggest they require further investigation. The results suggest that deer should be considered as potential, although probably localised, sources of infection for cattle. PMID:16434219

  1. Biological effect of low-head sea lamprey barriers: Designs for extensive surveys and the value of incorporating intensive process-oriented research

    USGS Publications Warehouse

    Hayes, D.B.; Baylis, J.R.; Carl, L.M.; Dodd, H.R.; Goldstein, J.D.; McLaughlin, R.L.; Noakes, D.L.G.; Porto, L.M.

    2003-01-01

    Four sampling designs for quantifying the effect of low-head sea lamprey (Petromyzon marinus) barriers on fish communities were evaluated, and the contribution of process-oriented research to the overall confidence of results obtained was discussed. The designs include: (1) sample barrier streams post-construction; (2) sample barrier and reference streams post-construction; (3) sample barrier streams pre- and post-construction; and (4) sample barrier and reference streams pre- and post-construction. In the statistical literature, the principal basis for comparison of sampling designs is generally the precision achieved by each design. In addition to precision, designs should be compared based on the interpretability of results and on the scale to which the results apply. Using data collected in a broad survey of streams with and without sea lamprey barriers, some of the tradeoffs that occur among precision, scale, and interpretability are illustrated. Although circumstances such as funding and availability of pre-construction data may limit which design can be implemented, a pre/post-construction design including barrier and reference streams provides the most meaningful information for use in barrier management decisions. Where it is not feasible to obtain pre-construction data, a design including reference streams is important to maintain the interpretability of results. Regardless of the design used, process-oriented research provides a framework for interpreting results obtained in broad surveys. As such, information from both extensive surveys and intensive process-oriented research provides the best basis for fishery management actions, and gives researchers and managers the most confidence in the conclusions reached regarding the effects of sea lamprey barriers.
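As an illustrative aside, the pre/post-construction design with barrier and reference streams (design 4) estimates the barrier effect as a difference of differences: the change in the barrier stream beyond the background change seen in the reference stream. The counts below are hypothetical:

```python
# Hedged sketch of the pre/post + barrier/reference contrast
# (a before-after control-impact comparison). Counts are hypothetical.
barrier_pre, barrier_post = 120.0, 80.0       # fish counts, barrier stream
reference_pre, reference_post = 110.0, 105.0  # fish counts, reference stream

# Barrier effect = change at the barrier stream minus the
# background change observed in the reference stream.
effect = (barrier_post - barrier_pre) - (reference_post - reference_pre)
print(effect)  # -35.0
```

This is why the reference streams matter for interpretability: without them, a region-wide decline would be indistinguishable from a barrier effect.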

  2. The Quantitative Reasoning for College Science (QuaRCS) Assessment, 1: Development and Validation

    E-print Network

    Follette, Katherine B; Dokter, Erin; Buxner, Sanlyn; Prather, Edward

    2015-01-01

    Science is an inherently quantitative endeavor, and general education science courses are taken by a majority of college students. As such, they are a powerful venue for advancing students' skills and attitudes toward mathematics. This article reports on the development and validation of the Quantitative Reasoning for College Science (QuaRCS) Assessment, a numeracy assessment instrument designed for college-level general education science students. It has been administered to more than four thousand students over eight semesters of refinement. We show that the QuaRCS is able to distinguish varying levels of quantitative literacy and present performance statistics for both individual items and the instrument as a whole. Responses from a survey of forty-eight Astronomy and Mathematics educators show that these two groups share views regarding which quantitative skills are most important in the contexts of science literacy and educated citizenship, and the skills assessed with the QuaRCS are drawn from these ran...

  3. Berkeley Quantitative Genome Browser

    Energy Science and Technology Software Center (ESTSC)

    2008-02-29

    The Berkeley Quantitative Genome Browser provides graphical browsing functionality for genomic data organized, at a minimum, by sequence and position. While supporting the annotation browsing features typical of many other genomic browsers, additional emphasis is placed on viewing and utilizing quantitative data. Data may be read from GFF, SGR, FASTA or any column-delimited format. Once the data have been read into the browser's buffer, they may be searched, filtered or subjected to mathematical transformation. The browser also supplies some graphical design manipulation functionality geared towards preparing figures for presentations or publication. A plug-in mechanism enables development outside the core functionality that adds more advanced or esoteric analysis capabilities. BBrowse's development and distribution is open-source, and it has been built to run on Linux, OSX and MS Windows operating systems.

  4. Geothermal energy as a source of electricity. A worldwide survey of the design and operation of geothermal power plants

    SciTech Connect

    DiPippo, R.

    1980-01-01

    An overview of geothermal power generation is presented. A survey of geothermal power plants is given for the following countries: China, El Salvador, Iceland, Italy, Japan, Mexico, New Zealand, Philippines, Turkey, USSR, and USA. A survey of countries planning geothermal power plants is included. (MHR)

  5. A Quantitative Multimodal Discourse Analysis of Teaching and Learning in a Web-Conferencing Environment--The Efficacy of Student-Centred Learning Designs

    ERIC Educational Resources Information Center

    Bower, Matt; Hedberg, John G.

    2010-01-01

    This paper presents a quantitative approach to multimodal discourse analysis for analyzing online collaborative learning. The coding framework draws together the fields of systemic functional linguistics and Activity Theory to analyze interactions between collaborative-, content- and technology-related discourse. The approach is used to examine…

  6. Identifying Influential Facilitators of Mathematics Professional Development: A Survey Analysis of Elementary School Teachers

    ERIC Educational Resources Information Center

    Linder, Sandra M.; Eckhoff, Angela; Igo, Larry B.; Stegelin, Dolores

    2013-01-01

    This paper builds on results from a previous phenomenological study examining characteristics of influential facilitators of elementary mathematics professional development. The current study utilized a survey design where results from the qualitative investigation were quantitized to develop an instrument that allowed participants to identify…

  7. 78 FR 5459 - Medicare Program; Request for Information To Aid in the Design and Development of a Survey...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ...Identification and assessment of patient-reported outcomes, such as pain, nausea and vomiting, deep vein thrombosis, infection, pneumonia, and urinary retention. We are looking for suggested topic areas, as well as any publicly available surveys,...

  8. 78 FR 5458 - Medicare Program; Request for Information To Aid in the Design and Development of a Survey...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ...Development of a Survey Regarding Patient and Family Member/Friend Experiences With Hospice...for information regarding patient and family member or close friend experiences with...quality health care for individuals, families, employers, and government. The...

  9. 77 FR 71600 - Medicare Program; Request for Information To Aid in the Design and Development of a Survey...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-03

    ...Development of a Survey Regarding Patient Experiences With Emergency Department Care AGENCY...information regarding consumer and patient experiences with emergency department care. DATES...focusing on the patient and caregiver experience, including those discussed later...

  10. Comparing Model-based and Design-based Structural Equation Modeling Approaches in Analyzing Complex Survey Data 

    E-print Network

    Wu, Jiun-Yu

    2011-10-21

    Conventional statistical methods assuming data sampled under simple random sampling are inadequate for use on complex survey data with a multilevel structure and non-independent observations. In structural equation modeling ...

  11. Computational vaccinology: quantitative approaches.

    PubMed

    Flower, Darren R; McSparron, Helen; Blythe, Martin J; Zygouri, Christianna; Taylor, Debra; Guan, Pingping; Wan, Shouzhan; Coveney, Peter V; Walshe, Valerie; Borrow, Persephone; Doytchinova, Irini A

    2003-01-01

    The immune system is hierarchical and has many levels, exhibiting much emergent behaviour. However, at its heart are molecular recognition events that are indistinguishable from other types of biomacromolecular interaction. These can be addressed well by quantitative experimental and theoretical biophysical techniques, and particularly by methods from drug design. We review here our approach to computational immunovaccinology. In particular, we describe the JenPep database and two new techniques for T cell epitope prediction. One is based on quantitative structure-activity relationships (a 3D-QSAR method based on CoMSIA and another 2D method based on the Free-Wilson approach) and the other on atomistic molecular dynamics simulations using high performance computing. JenPep (http://www.jenner.ar.uk/JenPep) is a relational database system supporting quantitative data on peptide binding to major histocompatibility complexes, TAP transporters, TCR-pMHC complexes, and an annotated list of B cell and T cell epitopes. Our 2D-QSAR method factors the contribution to peptide binding from individual amino acids as well as 1-2 and 1-3 residue interactions. In the 3D-QSAR approach, the influence of five physicochemical properties (volume, electrostatic potential, hydrophobicity, hydrogen-bond donor and acceptor abilities) on peptide affinity was considered. Both methods are exemplified through their application to the well-studied problem of peptide binding to the human class I MHC molecule HLA-A*0201. PMID:14712934

  12. City Governments and Aging in Place: Community Design, Transportation and Housing Innovation Adoption

    ERIC Educational Resources Information Center

    Lehning, Amanda J.

    2012-01-01

    Purpose of the study: To examine the characteristics associated with city government adoption of community design, housing, and transportation innovations that could benefit older adults. Design and methods: A mixed-methods study with quantitative data collected via online surveys from 62 city planners combined with qualitative data collected via…

  13. EuropeaN Energy balance Research to prevent excessive weight Gain among Youth (ENERGY) project: Design and methodology of the ENERGY cross-sectional survey

    PubMed Central

    2011-01-01

    Background Obesity treatment is by and large ineffective in the long term, and more emphasis on the prevention of excessive weight gain in childhood and adolescence is warranted. To inform energy balance related behaviour (EBRB) change interventions, insight into the potential personal, family and school environmental correlates of these behaviours is needed. Studies on such multilevel correlates of EBRB among schoolchildren in Europe are lacking. The ENERGY survey aims to (1) provide up-to-date prevalence rates of measured overweight, obesity, self-reported engagement in EBRBs, objective accelerometer-based assessment of physical activity and sedentary behaviour, and blood-sample biomarkers of metabolic function in countries in different regions of Europe, and (2) identify personal, family and school environmental correlates of these EBRBs. This paper describes the design, methodology and protocol of the survey. Method/Design A school-based cross-sectional survey was carried out in 2010 in seven European countries: Belgium, Greece, Hungary, the Netherlands, Norway, Slovenia, and Spain. The survey included measurements of anthropometrics, child, parent and school-staff questionnaires, and school observations to measure and assess outcomes (i.e. height, weight, and waist circumference), EBRBs and potential personal, family and school environmental correlates of these behaviours, including social-cultural, physical, political, and economic environmental factors. In addition, a selection of countries conducted accelerometer measurements to objectively assess physical activity and sedentary behaviour, and collected blood samples to assess several biomarkers of metabolic function. Discussion The ENERGY survey is a comprehensive cross-sectional study measuring anthropometrics and biomarkers as well as assessing a range of EBRBs and their potential correlates at the personal, family and school level, among 10-12 year old children in seven European countries. 
This study will result in a unique dataset, enabling cross country comparisons in overweight, obesity, risk behaviours for these conditions as well as the correlates of engagement in these risk behaviours. PMID:21281466

  14. Quantitative Reasoning in Environmental Science: A Learning Progression

    ERIC Educational Resources Information Center

    Mayes, Robert Lee; Forrester, Jennifer Harris; Christus, Jennifer Schuttlefield; Peterson, Franziska Isabel; Bonilla, Rachel; Yestness, Nissa

    2014-01-01

    The ability of middle and high school students to reason quantitatively within the context of environmental science was investigated. A quantitative reasoning (QR) learning progression was created with three progress variables: quantification act, quantitative interpretation, and quantitative modeling. An iterative research design was used as it…

  15. Detailed flow surveys of turning vanes designed for a 0.1-scale model of NASA Lewis Research Center's proposed altitude wind tunnel

    NASA Technical Reports Server (NTRS)

    Moore, Royce D.; Shyne, Rickey J.; Boldman, Donald R.; Gelder, Thomas F.

    1987-01-01

    Detailed flow surveys downstream of the corner turning vanes and downstream of the fan inlet guide vanes have been obtained in a 0.1-scale model of the NASA Lewis Research Center's proposed Altitude Wind Tunnel. Two turning vane designs were evaluated in both corners 1 and 2 (the corners between the test section and the drive fan). Vane A was a controlled-diffusion airfoil and vane B was a circular-arc airfoil. At given flows the turning vane wakes were surveyed to determine the vane pressure losses. For both corners the vane A turning vane configuration gave lower losses than the vane B configuration in the regions where the flow regime should be representative of two-dimensional flow. For both vane sets the vane loss coefficient increased rapidly near the walls.

  16. Teaching quantitative biology: goals, assessments, and resources

    PubMed Central

    Aikens, Melissa L.; Dolan, Erin L.

    2014-01-01

    More than a decade has passed since the publication of BIO2010, calling for an increased emphasis on quantitative skills in the undergraduate biology curriculum. In that time, relatively few papers have been published that describe educational innovations in quantitative biology or provide evidence of their effects on students. Using a “backward design” framework, we lay out quantitative skill and attitude goals, assessment strategies, and teaching resources to help biologists teach more quantitatively. Collaborations between quantitative biologists and education researchers are necessary to develop a broader and more appropriate suite of assessment tools, and to provide much-needed evidence on how particular teaching strategies affect biology students' quantitative skill development and attitudes toward quantitative work. PMID:25368425

  17. 78 FR 5458 - Medicare Program; Request for Information To Aid in the Design and Development of a Survey...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ... Hospice Care AGENCY: Centers for Medicare & Medicaid Services (CMS), HHS. ACTION: Request for information... experiences with hospice care. DATES: The information solicited in this notice must be received at the address... HospiceSurvey@cms.hhs.gov or by postal mail at Centers for Medicare and Medicaid Services,...

  18. A WHOLE-LAKE WATER QUALITY SURVEY OF LAKE OAHE BASED ON A SPATIALLY-BALANCED PROBABILISTIC DESIGN

    EPA Science Inventory

    Assessing conditions on large bodies of water presents multiple statistical and logistical challenges. As part of the Upper Missouri River Program of the Environmental Monitoring and Assessment Project (EMAP) we surveyed water quality of Lake Oahe in July-August, 2002 using a spat...

  19. Evaluation of Nine Consensus Indices in Delphi Foresight Research and Their Dependency on Delphi Survey Characteristics: A Simulation Study and Debate on Delphi Design and Interpretation.

    PubMed

    Birko, Stanislav; Dove, Edward S; Özdemir, Vural

    2015-01-01

    The extent of consensus (or the lack thereof) among experts in emerging fields of innovation can serve as antecedents of scientific, societal, investor and stakeholder synergy or conflict. Naturally, how we measure consensus is of great importance to science and technology strategic foresight. The Delphi methodology is a widely used anonymous survey technique to evaluate consensus among a panel of experts. Surprisingly, there is little guidance on how indices of consensus can be influenced by parameters of the Delphi survey itself. We simulated a classic three-round Delphi survey building on the concept of clustered consensus/dissensus. We evaluated three study characteristics that are pertinent for design of Delphi foresight research: (1) the number of survey questions, (2) the sample size, and (3) the extent to which experts conform to group opinion (the Group Conformity Index) in a Delphi study. Their impacts on the following nine Delphi consensus indices were then examined in 1000 simulations: Clustered Mode, Clustered Pairwise Agreement, Conger's Kappa, De Moivre index, Extremities Version of the Clustered Pairwise Agreement, Fleiss' Kappa, Mode, the Interquartile Range and Pairwise Agreement. The dependency of a consensus index on the Delphi survey characteristics was expressed from 0.000 (no dependency) to 1.000 (full dependency). The number of questions (range: 6 to 40) in a survey did not have a notable impact whereby the dependency values remained below 0.030. The variation in sample size (range: 6 to 50) displayed the top three impacts for the Interquartile Range, the Clustered Mode and the Mode (dependency = 0.396, 0.130, 0.116, respectively). 
The Group Conformity Index, a construct akin to measuring stubbornness/flexibility of experts' opinions, greatly impacted all nine Delphi consensus indices (dependency = 0.200 to 0.504), except the Extremity CPWA and the Interquartile Range that were impacted only beyond the first decimal point (dependency = 0.087 and 0.083, respectively). Scholars in technology design, foresight research and future(s) studies might consider these new findings in strategic planning of Delphi studies, for example, in rational choice of consensus indices and sample size, or accounting for confounding factors such as experts' variable degrees of conformity (stubbornness/flexibility) in modifying their opinions. PMID:26270647
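
    Two of the simpler indices examined above, Pairwise Agreement and the Interquartile Range, can be computed directly from a panel's ratings. A minimal sketch with an invented six-expert panel (the ratings below are illustrative, not from the authors' simulations):

```python
import numpy as np
from itertools import combinations

def pairwise_agreement(ratings):
    """Fraction of expert pairs that gave identical ratings to an item."""
    pairs = list(combinations(ratings, 2))
    return sum(a == b for a, b in pairs) / len(pairs)

def interquartile_range(ratings):
    """IQR of the ratings; a narrower IQR is read as stronger consensus."""
    q75, q25 = np.percentile(ratings, [75, 25])
    return q75 - q25

# Hypothetical Delphi panel: 6 experts rating one item on a 1-9 scale.
ratings = [7, 7, 8, 7, 7, 8]

pa = pairwise_agreement(ratings)    # 7 agreeing pairs out of 15
iqr = interquartile_range(ratings)  # narrow spread
```

    In a full simulation these statistics would be recomputed per question and per round while varying the panel size and conformity parameters, which is how dependency values like those reported above can be estimated.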

  20. Evaluation of Nine Consensus Indices in Delphi Foresight Research and Their Dependency on Delphi Survey Characteristics: A Simulation Study and Debate on Delphi Design and Interpretation

    PubMed Central

    Birko, Stanislav; Dove, Edward S.; Özdemir, Vural

    2015-01-01

    The extent of consensus (or the lack thereof) among experts in emerging fields of innovation can serve as antecedents of scientific, societal, investor and stakeholder synergy or conflict. Naturally, how we measure consensus is of great importance to science and technology strategic foresight. The Delphi methodology is a widely used anonymous survey technique to evaluate consensus among a panel of experts. Surprisingly, there is little guidance on how indices of consensus can be influenced by parameters of the Delphi survey itself. We simulated a classic three-round Delphi survey building on the concept of clustered consensus/dissensus. We evaluated three study characteristics that are pertinent for design of Delphi foresight research: (1) the number of survey questions, (2) the sample size, and (3) the extent to which experts conform to group opinion (the Group Conformity Index) in a Delphi study. Their impacts on the following nine Delphi consensus indices were then examined in 1000 simulations: Clustered Mode, Clustered Pairwise Agreement, Conger’s Kappa, De Moivre index, Extremities Version of the Clustered Pairwise Agreement, Fleiss’ Kappa, Mode, the Interquartile Range and Pairwise Agreement. The dependency of a consensus index on the Delphi survey characteristics was expressed from 0.000 (no dependency) to 1.000 (full dependency). The number of questions (range: 6 to 40) in a survey did not have a notable impact whereby the dependency values remained below 0.030. The variation in sample size (range: 6 to 50) displayed the top three impacts for the Interquartile Range, the Clustered Mode and the Mode (dependency = 0.396, 0.130, 0.116, respectively). 
The Group Conformity Index, a construct akin to measuring stubbornness/flexibility of experts’ opinions, greatly impacted all nine Delphi consensus indices (dependency = 0.200 to 0.504), except the Extremity CPWA and the Interquartile Range that were impacted only beyond the first decimal point (dependency = 0.087 and 0.083, respectively). Scholars in technology design, foresight research and future(s) studies might consider these new findings in strategic planning of Delphi studies, for example, in rational choice of consensus indices and sample size, or accounting for confounding factors such as experts’ variable degrees of conformity (stubbornness/flexibility) in modifying their opinions. PMID:26270647

  1. Limitations of the House of Quality to provide quantitative design information

    E-print Network

    Lewis, Kemper E.

    Limitations of the House of Quality to provide quantitative design information. Andrew Olewnik, SUNY Buffalo, New York, USA. Abstract Purpose: The House of Quality (HoQ) is a popular design tool ... quantitative information for design. Keywords: House of quality, Quality function deployment, Design

  2. The Math You Need, When You Need It: Student-Centered Web Resources Designed to Decrease Math Review and Increase Quantitative Geology in the Classroom

    NASA Astrophysics Data System (ADS)

    Wenner, J. M.; Baer, E. M.

    2007-12-01

    Introductory geoscience courses are rife with quantitative concepts from graphing to rates to unit conversions. Recent research suggests that supplementary mathematical instruction increases post-secondary students' retention and performance in science courses. Nonetheless, many geoscience faculty feel that they do not have enough time to cover all the geoscience content, let alone covering the math they often feel students should have learned before reaching their classes. We present our NSF-funded effort to create web modules for students that address these concerns. Our web resources focus on both student performance and faculty time issues by building students' quantitative skills through web-based, self-paced modular tutorials. Each module can be assigned to individual students who have demonstrated on a pre-test that they are in need of supplemental instruction. The pre-test involves problems that place mathematical concepts in a geoscience context and determines the students who need the most support with these skills. Students needing support are asked to complete a three-pronged web-based module just before the concept is needed in class. The three parts of each tutorial include: an explanation of the mathematics, a page of practice problems and an on-line quiz that is graded and sent to the instructor. Each of the modules is steeped in best practices in mathematics and geoscience education, drawing on multiple contexts and utilizing technology. The tutorials also provide students with further resources so that they can explore the mathematics in more depth. To assess the rigor of this program, students are given the pre-test again at the end of the course. The uniqueness of this program lies in a rich combination of mathematical concepts placed in multiple geoscience contexts, giving students the opportunity to explore the way that math relates to the physical world. 
We present several preliminary modules dealing with topics common in introductory geoscience courses. We seek feedback from faculty teaching all levels of geoscience addressing several questions: In what math/geoscience topics do you feel students need supplemental instruction? Where do students come up against quantitative topics that make them drop the class or perform poorly? Would you be willing to review or help us to test these modules in your class?

  3. Doing Quantitative Research in Education with SPSS

    ERIC Educational Resources Information Center

    Muijs, Daniel

    2004-01-01

    This book looks at quantitative research methods in education. The book is structured to start with chapters on conceptual issues and designing quantitative research studies before going on to data analysis. While each chapter can be studied separately, a better understanding will be reached by reading the book sequentially. This book is intended…

  4. Design and Implementation of a Comprehensive Web-based Survey for Ovarian Cancer Survivorship with an Analysis of Prediagnosis Symptoms via Text Mining

    PubMed Central

    Sun, Jiayang; Bogie, Kath M; Teagno, Joe; Sun, Yu-Hsiang (Sam); Carter, Rebecca R; Cui, Licong; Zhang, Guo-Qiang

    2014-01-01

    Ovarian cancer (OvCa) is the most lethal gynecologic disease in the United States, with an overall 5-year survival rate of 44.5%, about half of the 89.2% for all breast cancer patients. To identify factors that possibly contribute to the long-term survivorship of women with OvCa, we conducted a comprehensive online Ovarian Cancer Survivorship Survey from 2009 to 2013. This paper presents the design and implementation of our survey, introduces its resulting data source, the OVA-CRADLE™ (Clinical Research Analytics and Data Lifecycle Environment), and illustrates a sample application of the survey and data by an analysis of prediagnosis symptoms, using text mining and statistics. The OVA-CRADLE™ is an application of our patented Physio-MIMI technology, facilitating Web-based access, online query and exploration of data. The prediagnostic symptoms and association of early-stage OvCa diagnosis with endometriosis provide potentially important indicators for future studies in this field. PMID:25861211

  5. Electronic Survey Methodology

    E-print Network

    Nonnecke, Blair

    Electronic Survey Methodology: A Case Study in Reaching Hard..., Maryland, preece@umbc.edu, 2002 © Andrews, Nonnecke and Preece. Conducting Research on the Internet: Electronic Survey Design, Development and Implementation Guidelines

  6. CISE REU Student Survey 2014: Informed Consent for CISE REU Assessment

    E-print Network

    Dahlberg, Teresa A.

    CISE REU Student Survey 2014: Informed Consent for CISE REU Assessment. Project Title and Purpose: You are invited to take a few minutes to complete a brief survey. This survey is designed to evaluate the effectiveness

  7. Three-dimensional quantitative structure-activity relationships and docking studies of some structurally diverse flavonoids and design of new aldose reductase inhibitors.

    PubMed

    Chandra De, Utpal; Debnath, Tanusree; Sen, Debanjan; Debnath, Sudhan

    2015-01-01

    Aldose reductase (AR) plays an important role in the development of several long-term diabetic complications. Inhibition of AR activities is a strategy for controlling complications arising from chronic diabetes. Several AR inhibitors have been reported in the literature. Flavonoid type compounds are shown to have significant AR inhibition. The objective of this study was to perform a computational work to get an idea about structural insight of flavonoid type compounds for developing as well as for searching new flavonoid based AR inhibitors. The data-set comprising 68 flavones along with their pIC50 values ranging from 0.44 to 4.59 has been collected from the literature. Structures of all the flavonoids were drawn in ChemBioDraw Ultra 11.0, converted into the corresponding three-dimensional structures, saved as mol files and then imported into the Maestro project table. Imported ligands were prepared using the LigPrep option of Maestro 9.6. Three-dimensional quantitative structure-activity relationship and docking studies were performed with appropriate options of Maestro 9.6 installed in an HP Z820 workstation with CentOS 6.3 (Linux). A model with partial least squares factor 5, standard deviation 0.2482, R² = 0.9502 and variance ratio of regression 122 was found to be the best statistical model. PMID:25709964

  8. Design, objectives, and lessons from a pilot 25 year follow up re- survey of survivors in the Whitehall study of London Civil Servants

    PubMed Central

    Clarke, R.; Breeze, E.; Sherliker, P.; Shipley, M.; Youngman, L.; Fletcher, A.; Fuhrer, R.; Leon, D.; Parish, S.; Collins, R.; Marmot, M.

    1998-01-01

    DESIGN: To assess the feasibility of conducting a re-survey of men who are resident in the United Kingdom 25 years after enrollment in the Whitehall study of London Civil Servants. METHODS: A random sample of 401 study survivors resident in three health authority areas was selected for this pilot study. They were mailed a request to complete a self administered questionnaire, and then asked to attend their general practice to have their blood pressure, weight, and height measured and a blood sample collected into a supplied vacutainer, and mailed to a central laboratory. Using a 2 x 2 factorial design, the impact of including additional questions on income and of an informant questionnaire on cognitive function was assessed. RESULTS: Accurate addresses were obtained from the health authorities for 96% of the sample. Questionnaires were received from 73% and blood samples from 61% of the sample. Questions on income had no adverse effect on the response rate, but inclusion of the informant questionnaire did. Between 1970 and 1995 there were substantial changes within individual men in the mean blood pressure and blood total cholesterol recorded, as reflected by correlation coefficients between 1970 and 1995 values of 0.26 and 0.30 for systolic and diastolic blood pressure and 0.38 for total cholesterol. CONCLUSION: This pilot study demonstrated the feasibility of conducting a re-survey using postal questionnaires and mailed whole blood samples. The magnitude of change in blood pressure and blood total cholesterol concentrations within individuals was greater than anticipated, suggesting that such remeasurements may be required at different intervals in prospective studies to help interpret risk associations properly. These issues will be considered in a re-survey of the remaining survivors of the Whitehall study. PMID:9764257
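
    The within-person tracking reported above (0.26 and 0.30 for blood pressure, 0.38 for total cholesterol) is an ordinary Pearson correlation between baseline and re-survey values. A minimal sketch with invented paired readings:

```python
import numpy as np

# Hypothetical paired measurements: systolic blood pressure (mm Hg) for
# the same men at baseline (1970) and at re-survey (1995). Values invented.
bp_1970 = np.array([120.0, 135.0, 128.0, 150.0, 142.0, 133.0])
bp_1995 = np.array([138.0, 131.0, 145.0, 155.0, 139.0, 150.0])

# Pearson correlation: covariance scaled by the two standard deviations.
r = np.corrcoef(bp_1970, bp_1995)[0, 1]
```

    A low r over 25 years, as in the study, signals substantial within-person change, which is why the authors argue that remeasurement at intervals is needed to interpret risk associations properly.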

  9. Obesity-related behaviours and BMI in five urban regions across Europe: sampling design and results from the SPOTLIGHT cross-sectional survey

    PubMed Central

    Lakerveld, Jeroen; Ben Rebah, Maher; Mackenbach, Joreintje D; Charreire, Hélène; Compernolle, Sofie; Glonti, Ketevan; Bardos, Helga; Rutter, Harry; De Bourdeaudhuij, Ilse; Brug, Johannes; Oppert, Jean-Michel

    2015-01-01

    Objectives To describe the design, methods and first results of a survey on obesity-related behaviours and body mass index (BMI) in adults living in neighbourhoods from five urban regions across Europe. Design A cross-sectional observational study in the framework of a European Union-funded project on obesogenic environments (SPOTLIGHT). Setting 60 urban neighbourhoods (12 per country) were randomly selected in large urban zones in Belgium, France, Hungary, the Netherlands and the UK, based on high or low values for median household income (socioeconomic status, SES) and residential area density. Participants A total of 6037 adults (mean age 52 years, 56% female) participated in the online survey. Outcome measures Self-reported physical activity, sedentary behaviours, dietary habits and BMI. Other measures included general health; barriers and motivations for a healthy lifestyle; perceived social and physical environmental characteristics; the availability of transport modes and their use to specific destinations; self-defined neighbourhood boundaries; and items related to residential selection. Results Across five countries, residents from low-SES neighbourhoods ate less fruit and vegetables, drank more sugary drinks and had a consistently higher BMI. SES differences in sedentary behaviours were observed in France, with residents from higher SES neighbourhoods reporting to sit more. Residents from low-density neighbourhoods were less physically active than those from high-density neighbourhoods, during leisure time and (most pronounced) for transport (except for Belgium). BMI differences by residential density were inconsistent across all countries. Conclusions The SPOTLIGHT survey provides an original approach for investigating relations between environmental characteristics, obesity-related behaviours and obesity in Europe. First descriptive results indicate considerable differences in health behaviours and BMI between countries and neighbourhood types. 
PMID:26507356

  10. Cartography at the U.S. Geological Survey: the National Mapping Division's cartographic programs, products, design, and technology

    USGS Publications Warehouse

    Ogrosky, Charles E.; Gwynn, William; Jannace, Richard

    1989-01-01

    The U.S. Geological Survey (USGS) is the prime source of many kinds of topographic and special-purpose maps of the United States and its outlying areas. It is also a prime source of digital map data. One main goal of the USGS is to provide large-scale topographic map coverage of the entire United States. Most of the Nation is already covered. We expect that initial coverage will be completed by 1991. For many purposes, many public agencies, private organizations, and individuals need reliable cartographic and geographic knowledge about our Nation. To serve such needs, all USGS maps are compiled to exacting standards of accuracy and content.

  11. QUANTITATIVE DECISION TOOLS AND MANAGEMENT DEVELOPMENT PROGRAMS.

    ERIC Educational Resources Information Center

    Byars, Lloyd L.; Nunn, Geoffrey E.

    This article outlined the current status of quantitative methods and operations research (OR), sketched the strengths of training efforts and isolated weaknesses, and formulated workable criteria for evaluating the success of operations research training programs. A survey of 105 companies revealed that PERT, inventory control theory and linear…

  12. Developing Geoscience Students' Quantitative Skills

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2005-12-01

    Sophisticated quantitative skills are an essential tool for the professional geoscientist. While students learn many of these sophisticated skills in graduate school, it is increasingly important that they have a strong grounding in quantitative geoscience as undergraduates. Faculty have developed many strong approaches to teaching these skills in a wide variety of geoscience courses. A workshop in June 2005 brought together eight faculty teaching surface processes and climate change to discuss and refine activities they use and to publish them on the Teaching Quantitative Skills in the Geosciences website (serc.Carleton.edu/quantskills) for broader use. Workshop participants in consultation with two mathematics faculty who have expertise in math education developed six review criteria to guide discussion: 1) Are the quantitative and geologic goals central and important? (e.g. problem solving, mastery of important skill, modeling, relating theory to observation); 2) Does the activity lead to better problem solving? 3) Are the quantitative skills integrated with geoscience concepts in a way that makes sense for the learning environment and supports learning both quantitative skills and geoscience? 4) Does the methodology support learning? (e.g. motivate and engage students; use multiple representations, incorporate reflection, discussion and synthesis) 5) Are the materials complete and helpful to students? 6) How well has the activity worked when used? Workshop participants found that reviewing each others activities was very productive because they thought about new ways to teach and the experience of reviewing helped them think about their own activity from a different point of view. The review criteria focused their thinking about the activity and would be equally helpful in the design of a new activity. 
    We invite a broad international discussion of the criteria (serc.Carleton.edu/quantskills/workshop05/review.html). The teaching activities can be found on the Teaching Quantitative Skills in the Geosciences website (serc.Carleton.edu/quantskills/). In addition to the teaching activity collection (85 activities), this site contains a variety of resources to assist faculty with the methods they use to teach quantitative skills at both the introductory and advanced levels; information about broader efforts in quantitative literacy involving other science disciplines; and a special section of resources for students who are struggling with their quantitative skills. The site is part of the Digital Library for Earth Science Education and has been developed by geoscience faculty in collaboration with mathematicians and mathematics educators with funding from the National Science Foundation.

  13. Quantitative research and issues of political sensitivity in rural China

    E-print Network

    Tsai, Lily L.

    Political sensitivity is always a challenge for the scholar doing fieldwork in nondemocratic and transitional systems, especially when doing surveys and quantitative research. Not only are more research topics likely to ...

  14. Establishment of a 100-seed weight quantitative trait locus-allele matrix of the germplasm population for optimal recombination design in soybean breeding programmes.

    PubMed

    Zhang, Yinghu; He, Jianbo; Wang, Yufeng; Xing, Guangnan; Zhao, Jinming; Li, Yan; Yang, Shouping; Palmer, R G; Zhao, Tuanjie; Gai, Junyi

    2015-09-01

    A representative sample comprising 366 accessions from the Chinese soybean landrace population (CSLRP) was tested under four growth environments for determination of the whole-genome quantitative trait loci (QTLs) system of the 100-seed weight trait (ranging from 4.59 g to 40.35 g) through genome-wide association study (GWAS). A total of 116 769 single nucleotide polymorphisms (SNPs) were identified and organized into 29 121 SNP linkage disequilibrium blocks (SNPLDBs) to fit the property of multiple alleles/haplotypes per locus in germplasm. An innovative two-stage GWAS was conducted using a single locus model for shrinking the marker number followed by a multiple loci model utilizing a stepwise regression for the whole-genome QTL identification. In total, 98.45% of the phenotypic variance (PV) was accounted for by four large-contribution major QTLs (36.33%), 51 small-contribution major QTLs (43.24%), and a number of unmapped minor QTLs (18.88%), with the QTL×environment variance representing only 1.01% of the PV. The allele numbers of each QTL ranged from two to 10. A total of 263 alleles along with the respective allele effects were estimated and organized into a 263×366 matrix, giving the compact genetic constitution of the CSLRP. Differentiations among the ecoregion matrices were found. No landrace had alleles which were all positive or all negative, indicating a hidden potential for recombination. The optimal crosses within and among ecoregions were predicted, and showed great transgressive potential. From the QTL system, 39 candidate genes were annotated, of which 26 were involved with the gene ontology categories of biological process, cellular component, and molecular function, indicating that diverse genes are involved in directing the 100-seed weight. PMID:26163701
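
    The QTL-allele matrix logic above, where each accession carries one allele per locus, each allele has an estimated additive effect, and a cross's transgressive potential comes from stacking the better parental allele at every locus, can be sketched with a miniature invented matrix (the study's actual matrix is 263 alleles x 366 accessions; all loci, alleles, and effects below are hypothetical):

```python
# Hypothetical miniature QTL-allele effect table: 3 loci with 2-3 alleles
# each. Effects are additive deviations (g) on 100-seed weight; invented.
allele_effects = {
    "qtl1": {"a1": 1.2, "a2": -0.4},
    "qtl2": {"b1": 0.3, "b2": 0.9, "b3": -0.2},
    "qtl3": {"c1": -0.5, "c2": 0.6},
}

# Genotypes of two parent accessions: one allele per locus.
parent1 = {"qtl1": "a1", "qtl2": "b3", "qtl3": "c1"}
parent2 = {"qtl1": "a2", "qtl2": "b2", "qtl3": "c2"}

def genotypic_value(genotype):
    """Sum of allele effects across loci (purely additive model)."""
    return sum(allele_effects[q][a] for q, a in genotype.items())

def best_cross_potential(p1, p2):
    """Upper bound for p1 x p2 progeny: the better parental allele at
    every locus (transgressive segregation under additivity)."""
    return sum(max(allele_effects[q][p1[q]], allele_effects[q][p2[q]])
               for q in allele_effects)

v1 = genotypic_value(parent1)                        # 0.5
v2 = genotypic_value(parent2)                        # 1.1
potential = best_cross_potential(parent1, parent2)   # 2.7, beats both parents
```

    Ranking candidate crosses by this upper bound is one simple way to operationalize "optimal recombination design" from such a matrix; the real study's prediction machinery is richer than this additive sketch.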

  15. The FLAMINGOS Extragalactic Survey

    NASA Astrophysics Data System (ADS)

    Gonzalez, A. H.; Elston, R. J.; Eisenhardt, P. R. M.; Lin, Y.; McKenzie, E. H.; Mohr, J. J.; Stanford, S. A.; Stern, D.

    2003-12-01

    The FLAMINGOS Extragalactic Survey is a deep, near-infrared imaging survey designed to study galaxy evolution and the galaxy cluster population out to z ~ 2. When complete, the survey will cover 10 deg² down to J=22 and Ks=20.5 (AB magnitudes), with the survey area consisting of two contiguous 5 deg² fields in the NDWFS region. In this talk I present a brief overview of the survey and target science, including the current status of the program and preliminary results. The author acknowledges support from an NSF Astronomy and Astrophysics Postdoctoral Fellowship.

  16. Development and validation of LC-MS/MS method for the quantitation of lenalidomide in human plasma using Box-Behnken experimental design.

    PubMed

    Hasnain, M Saquib; Rao, Shireen; Singh, Manoj Kr; Vig, Nitin; Gupta, Amit; Ansari, Abdulla; Sen, Pradeep; Joshi, Pankaj; Ansari, Shaukat Ali

    2013-03-01

    For the determination of lenalidomide using carbamazepine as an internal standard, an ultra-fast stability-indicating LC-MS/MS method was developed, validated and optimized to support clinical advancement. The samples were prepared by solid-phase extraction. The calibration range was 2-1000 ng/mL, for which a quadratic regression (1/x² weighted) was best fitted. The method was validated, and a 3² factorial Box-Behnken experimental design was employed to assess robustness. The design varied three factors, mobile phase composition (X1), flow rate (X2) and pH (X3), with peak area (Y1) and retention time (Y2) taken as responses. This showed that small changes in mobile phase composition and flow rate affect the responses, while pH has no effect. Lenalidomide and carbamazepine were stable in human plasma after five freeze-thaw cycles, at room temperature for 23.7 h, and on the bench top for 6.4 h. This method complies with all the regulatory requirements for selectivity, sensitivity, precision, accuracy, and stability for the determination of lenalidomide in human plasma, and is sufficiently sensitive and effective for pharmacokinetic and bioequivalence studies of lenalidomide. PMID:23323263
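
    The quadratic calibration with 1/x² weighting mentioned above can be sketched with NumPy. Note that np.polyfit applies weights to the unsquared residuals, so passing w = 1/x yields 1/x² weighting of the squared residuals. The concentrations and responses below are invented, not the study's data:

```python
import numpy as np

# Hypothetical calibration standards: concentration (ng/mL) vs. peak-area
# ratio (analyte / internal standard). All values are invented.
conc = np.array([2.0, 10.0, 50.0, 200.0, 500.0, 1000.0])
resp = np.array([0.011, 0.052, 0.26, 1.05, 2.70, 5.60])

# Quadratic fit with 1/x^2 weighting: np.polyfit weights the unsquared
# residuals, so w = 1/x gives 1/x^2 weights on the squared residuals.
coeffs = np.polyfit(conc, resp, deg=2, w=1.0 / conc)

def back_calculate(area, c=coeffs):
    """Invert y = a*x^2 + b*x + c0 to recover concentration from a response."""
    a, b, c0 = c
    roots = np.roots([a, b, c0 - area])
    real = roots[np.isreal(roots)].real
    # keep the physically meaningful (smallest non-negative) root
    return float(real[real >= 0.0].min())

est = back_calculate(resp[2])  # should land near 50 ng/mL
```

    The 1/x² weighting keeps relative (rather than absolute) error roughly constant across the range, which is why it is common for wide calibration ranges like 2-1000 ng/mL.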

  17. Web Survey Design in ASP.Net 2.0: A Simple Task with One Line of Code

    ERIC Educational Resources Information Center

    Liu, Chang

    2007-01-01

    Over the past few years, more and more companies have been investing in electronic commerce (EC) by designing and implementing Web-based applications. In the world of practice, the importance of using Web technology to reach individual customers has been presented by many researchers. This paper presents an easy way of conducting marketing…

  18. Measuring health literacy in populations: illuminating the design and development process of the European Health Literacy Survey Questionnaire (HLS-EU-Q)

    PubMed Central

    2013-01-01

    Background Several measurement tools have been developed to measure health literacy. The tools vary in their approach and design, but few have focused on comprehensive health literacy in populations. This paper describes the design and development of the European Health Literacy Survey Questionnaire (HLS-EU-Q), an innovative, comprehensive tool to measure health literacy in populations. Methods Based on a conceptual model and definition, the process involved item development, pre-testing, field-testing, external consultation, a plain-language check, and translation from English into Bulgarian, Dutch, German, Greek, Polish, and Spanish. Results The development process resulted in the HLS-EU-Q, which comprises two sections: a core health literacy section and a section on determinants and outcomes associated with health literacy. The health literacy section includes 47 items addressing self-reported difficulties in accessing, understanding, appraising, and applying information in tasks concerning decision making in healthcare, disease prevention, and health promotion. The second section includes items related to health behaviour, health status, health service use, community participation, and socio-demographic and socio-economic factors. Conclusions By illuminating the detailed steps in the design and development process of the HLS-EU-Q, the aim is to provide a deeper understanding of its purpose, its capabilities, and its limitations for others using the tool. The vision is that, through wide application, the HLS-EU-Q will be validated in more countries to enhance the understanding of health literacy in different populations. PMID:24112855

  19. Bright galaxies at Hubble's redshift detection frontier: Preliminary results and design from the redshift z~9-10 BoRG pure-parallel HST survey

    E-print Network

    Calvi, V; Stiavelli, M; Oesch, P; Bradley, L D; Schmidt, K B; Coe, D; Brammer, G; Bernard, S; Bouwens, R J; Carrasco, D; Carollo, C M; Holwerda, B W; MacKenty, J W; Mason, C A; Shull, J M; Treu, T

    2015-01-01

    We present the first results and design from the redshift z~9-10 Brightest of the Reionizing Galaxies {\it Hubble Space Telescope} survey BoRG[z9-10], aimed at searching for intrinsically luminous unlensed galaxies during the first 700 Myr after the Big Bang. BoRG[z9-10] is the continuation of a multi-year pure-parallel near-IR and optical imaging campaign with the Wide Field Camera 3. The ongoing survey uses five filters, optimized for detecting the most distant objects and offering continuous wavelength coverage from $\lambda=0.35\mu$m to $\lambda=1.7\mu$m. We analyze the initial ~130 arcmin$^2$ of area over 28 independent lines of sight (~25% of the total planned) to search for z>7 galaxies using a combination of Lyman break and photometric redshift selections. From an effective comoving volume of (5-25)$\times 10^5$ Mpc$^3$ for magnitudes brighter than $m_{AB}=26.5-24.0$ in the $H_{160}$-band respectively, we find five galaxy candidates at z~8.3-10 detected at high confidence (S/N>8), including a sour...
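
    For orientation, the order of magnitude of a comoving volume like the one quoted can be checked with a simple flat ΛCDM shell integral. The cosmological parameters below (H0 = 70 km/s/Mpc, Ωm = 0.3) are illustrative assumptions, not values taken from the paper, and the sketch ignores the survey's magnitude-dependent selection function, so only the order of magnitude is meaningful.

```python
import math

C_KM_S = 299792.458          # speed of light, km/s
H0 = 70.0                    # assumed Hubble constant, km/s/Mpc
OMEGA_M, OMEGA_L = 0.3, 0.7  # assumed flat Lambda-CDM densities

def comoving_distance(z, steps=10000):
    """Line-of-sight comoving distance in Mpc (flat LCDM, trapezoid rule)."""
    dz = z / steps
    e = lambda zz: math.sqrt(OMEGA_M * (1 + zz) ** 3 + OMEGA_L)
    integral = sum((1 / e(i * dz) + 1 / e((i + 1) * dz)) / 2 * dz
                   for i in range(steps))
    return C_KM_S / H0 * integral

# Shell z ~ 8.3-10 over ~130 arcmin^2 (area from the abstract)
area_arcmin2 = 130.0
full_sky_arcmin2 = 4 * math.pi * (180 / math.pi) ** 2 * 3600
d_near, d_far = comoving_distance(8.3), comoving_distance(10.0)
volume = (area_arcmin2 / full_sky_arcmin2) * (4 * math.pi / 3) \
         * (d_far ** 3 - d_near ** 3)
print(f"{volume:.2e} Mpc^3")  # a few x 10^5 Mpc^3
```

    The result lands at a few times 10^5 Mpc^3, the same order as the paper's quoted effective volume range.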

  20. Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset

    USGS Publications Warehouse

    Raef, A.

    2009-01-01

    The 3D reflection seismic method has recently proliferated into near-surface geophysical applications, driven in particular by the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world. This trend justifies an emphasis on a cost-effective and robust quality control and assurance (QC/QA) workflow for preprocessing 3D seismic data that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable subsequent processing steps to use appropriate header information and data that are free of noise-dominated traces and flawed vertical stacking. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in diagnosing inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used to demonstrate the QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.

  1. The U. S. Geological Survey's Albemarle-Pamlico National Water-Quality Assessment Study; background and design

    USGS Publications Warehouse

    Spruill, T.B.; Harned, Douglas A.; McMahon, Gerard

    1995-01-01

    The Albemarle-Pamlico Study Unit is one of 20 National Water-Quality Assessment (NAWQA) studies begun in 1991 by the U.S. Geological Survey (USGS) to assess the Nation's water quality. One of the missions of the USGS is to assess the quantity and quality of the Nation's water resources. The NAWQA program was established to help accomplish this mission. The Albemarle-Pamlico Study Unit, located in Virginia and North Carolina, drains an area of about 28,000 square miles. Four major rivers, the Chowan, the Roanoke, the Tar-Pamlico and the Neuse, all drain into the Albemarle-Pamlico Sound in North Carolina. Four physiographic regions (areas of homogeneous climatic, geologic, and biological characteristics), the Valley and Ridge, Blue Ridge, Piedmont and Coastal Plain Physiographic Provinces are included within the Albemarle-Pamlico Study Unit. Until 1991, there was no single program that could answer the question, 'Are the Nation's ground and surface waters getting better, worse, or are they staying the same?' A program was needed to evaluate water quality by using standard techniques to allow assessment of water quality at local, regional, and national scales. The NAWQA Program was implemented to answer questions about the Nation's water quality using consistent and comparable methods. A total of 60 basins, or study units, will be in place by 1997 to assess the Nation's water quality.

  2. Quantitative and qualitative optimization of allergen extraction from peanut and selected tree nuts. Part 1. Screening of optimal extraction conditions using a D-optimal experimental design.

    PubMed

    L'Hocine, Lamia; Pitre, Mélanie

    2016-03-01

    A D-optimal design was constructed to optimize allergen extraction efficiency simultaneously from roasted, non-roasted, defatted, and non-defatted almond, hazelnut, peanut, and pistachio flours using three non-denaturing aqueous (phosphate, borate, and carbonate) buffers at various conditions of ionic strength, buffer-to-protein ratio, extraction temperature, and extraction duration. Statistical analysis showed that roasting and not defatting significantly lowered protein recovery for all nuts. Increasing the temperature and the buffer-to-protein ratio during extraction significantly increased protein recovery, whereas increasing the extraction time had no significant impact. The impact of the three buffers on protein recovery varied significantly among the nuts. Depending on the extraction conditions, protein recovery varied from 19% to 95% for peanut, 31% to 73% for almond, 17% to 64% for pistachio, and 27% to 88% for hazelnut. The buffer type and ionic strength were shown to modulate the protein and immunoglobulin E binding profiles of the extracts, and high protein recovery levels did not always correlate with high immunoreactivity. PMID:26471618

  3. Quantitative and qualitative optimization of allergen extraction from peanut and selected tree nuts. Part 2. Optimization of buffer and ionic strength using a full factorial experimental design.

    PubMed

    L'Hocine, Lamia; Pitre, Mélanie

    2016-03-01

    A full factorial design was used to assess the single and interactive effects of three non-denaturing aqueous (phosphate, borate, and carbonate) buffers at various ionic strengths (I) on allergen extractability from and immunoglobulin E (IgE) immunoreactivity of peanut, almond, hazelnut, and pistachio. The results indicated that the type and ionic strength of the buffer had different effects on protein recovery from the nuts under study. Substantial differences in protein profiles, abundance, and IgE-binding intensity with different combinations of pH and ionic strength were found. A significant interaction between pH and ionic strength was observed for pistachio and almond. The optimal buffer system conditions, which maximized the IgE-binding efficiency of allergens and provided satisfactory to superior protein recovery yield and profiles, were carbonate buffer at an ionic strength of I=0.075 for peanut, carbonate buffer at I=0.15 for almond, phosphate buffer at I=0.5 for hazelnut, and borate at I=0.15 for pistachio. The buffer type and its ionic strength could be manipulated to achieve the selective solubility of desired allergens. PMID:26471623
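
    The full factorial crossing described in Part 2 (buffer type x ionic strength) can be laid out mechanically. The buffer names and ionic-strength levels below are those reported in the abstract; crossing all three ionic-strength levels with every buffer is an illustrative simplification, since the study's actual level grid per buffer is not spelled out here.

```python
from itertools import product

buffers = ["phosphate", "borate", "carbonate"]
ionic_strengths = [0.075, 0.15, 0.5]  # levels mentioned in the abstract

# A full factorial crosses every buffer with every ionic strength.
design = list(product(buffers, ionic_strengths))
print(len(design))  # 9 runs

# Optimal conditions per nut, as reported in the abstract:
optimal = {
    "peanut": ("carbonate", 0.075),
    "almond": ("carbonate", 0.15),
    "hazelnut": ("phosphate", 0.5),
    "pistachio": ("borate", 0.15),
}
assert all(cond in design for cond in optimal.values())
```

    The point of the full factorial here is that it exposes the buffer-by-ionic-strength interaction the authors report for pistachio and almond, which a one-factor-at-a-time screen would miss.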

  4. Nursing Undergraduate Alumni Survey

    E-print Network

    Plotkin, Joshua B.

    Nursing Undergraduate Alumni Survey, 1997 & 2002. Methodology: The Nursing Alumni Survey was designed to assess alumni outcomes, including whether respondents were enrolled in a graduate program at the time of survey completion. The most popular advanced degree was clearly an MSN. [Tabulated response counts by career category, including Research and Teaching/Faculty, are omitted here.]

  5. Population and Star Formation Histories from the Outer Limits Survey

    NASA Astrophysics Data System (ADS)

    Brondel, Brian Joseph; Saha, Abhijit; Olszewski, Edward

    2015-08-01

    The Outer Limits Survey (OLS) is a deep survey of selected fields in the outlying areas of the Magellanic Clouds, conducted with the MOSAIC-II instrument on the Blanco 4-meter Telescope at CTIO. OLS is designed to probe the outer disk and halo structures of the Magellanic System. The survey comprises ~50 fields observed in the Landolt R and I and Washington C, M, and DDO51 filters, extending to a depth of about 24th magnitude in I. While qualitative examination of the resulting data has yielded interesting published results, we report here on quantitative analysis through matching of Hess diagrams to theoretical isochrones. We present analysis based on techniques developed by Dolphin (e.g., 2002, MNRAS, 332, 91) for fields observed by OLS. Our results broadly match those found by qualitative examination of the CMDs, but interesting details emerge from isochrone fitting.
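
    The Hess-diagram matching credited to Dolphin (2002) compares observed binned star counts against a model prediction with a Poisson maximum-likelihood statistic. A minimal sketch of that statistic is below; the toy bin counts are invented for illustration, and a real fit would build the model counts from isochrones and vary star-formation-history parameters to minimize the statistic.

```python
import math

def poisson_fit_stat(observed, model):
    """Poisson likelihood-ratio statistic of Dolphin (2002):
    -2 ln PLR = 2 * sum_i [ m_i - n_i + n_i ln(n_i / m_i) ],
    with the log term taken as zero for empty bins.

    observed: observed star counts n_i per CMD (Hess diagram) bin
    model:    predicted counts m_i per bin (must be > 0)
    Smaller values indicate a better match.
    """
    stat = 0.0
    for n, m in zip(observed, model):
        term = m - n
        if n > 0:
            term += n * math.log(n / m)
        stat += 2.0 * term
    return stat

# Toy comparison: a model close to the data scores lower.
obs = [5, 12, 3, 0, 7]
good = [5.2, 11.5, 3.1, 0.4, 6.8]
bad = [1.0, 20.0, 8.0, 2.0, 1.0]
assert poisson_fit_stat(obs, good) < poisson_fit_stat(obs, bad)
```

    A perfect match (model exactly equal to the data) drives the statistic to zero, which is why it behaves like a chi-square analogue valid in the low-count bins typical of sparse outer-field CMDs.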

  6. The path of placement of a removable partial denture: a microscope based approach to survey and design.

    PubMed

    Mamoun, John Sami

    2015-02-01

    This article reviews how to identify and develop a removable partial denture (RPD) path of placement and provides a literature review of the concept of the RPD path of placement, also known as the path of insertion. An optimal RPD path of placement, guided by mutually parallel guide planes, ensures that the RPD flanges fit intimately over edentulous ridge structures and that the framework fits intimately with guide plane surfaces. This intimate fit prevents food-collecting empty spaces between the intaglio surface of the framework and intraoral surfaces, and it ensures that RPD clasps engage enough tooth undercuts to retain the RPD. The article covers the causes of obstructions to intra-oral seating of an RPD, the causes of food-collecting empty spaces that may exist around an RPD, and how to determine whether a guide plane is parallel with the projected RPD path of placement. The article presents a method of using a surgical operating microscope, or high-magnification (6-8x or greater) binocular surgical loupes, combined with co-axial illumination, to identify a preliminary path of placement for an arch. This preliminary path of placement may guide a dentist or a dental laboratory technician when surveying a master cast of the arch to develop an RPD path of placement, or when verifying that intra-oral contouring has aligned tooth surfaces optimally with the RPD path of placement. In dentistry, a well-fitting RPD reduces long-term periodontal or structural damage to abutment teeth. PMID:25722842

  7. The PdBI arcsecond whirlpool survey (PAWS). I. A cloud-scale/multi-wavelength view of the interstellar medium in a grand-design spiral galaxy

    SciTech Connect

    Schinnerer, Eva; Meidt, Sharon E.; Hughes, Annie; Colombo, Dario; Pety, Jérôme; Schuster, Karl F.; Dumas, Gaëlle; García-Burillo, Santiago; Dobbs, Clare L.; Leroy, Adam K.; Kramer, Carsten; Thompson, Todd A.; Regan, Michael W.

    2013-12-10

    The Plateau de Bure Interferometer Arcsecond Whirlpool Survey has mapped the molecular gas in the central ~9 kpc of M51 in its ¹²CO(1-0) line emission at a cloud-scale resolution of ~40 pc using both IRAM telescopes. We utilize this data set to quantitatively characterize the relation of molecular gas (or CO emission) to other tracers of the interstellar medium, star formation, and stellar populations of varying ages. Using two-dimensional maps, a polar cross-correlation technique, and pixel-by-pixel diagrams, we find: (1) that (as expected) the distribution of the molecular gas can be linked to different components of the gravitational potential; (2) evidence for a physical link between CO line emission and radio continuum that seems not to be caused by massive stars, but rather depends on the gas density; (3) a close spatial relation between polycyclic aromatic hydrocarbon (PAH) and molecular gas emission, but no predictive power of PAH emission for the molecular gas mass; (4) that the I – H color map is an excellent predictor of the distribution (and to a lesser degree, the brightness) of CO emission; and (5) that the impact of massive (UV-intense) young star-forming regions on the bulk of the molecular gas in the central ~9 kpc cannot be significant due to a complex spatial relation between molecular gas and star-forming regions that ranges from cospatial to spatially offset to absent. The last point, in particular, highlights the importance of galactic environment, and thus the underlying gravitational potential, for the distribution of molecular gas and star formation.

  8. DRAFT - Design of Radiological Survey and Sampling to Support Title Transfer or Lease of Property on the Department of Energy Oak Ridge Reservation

    SciTech Connect

    Cusick L.T.

    2002-09-25

    The U.S. Department of Energy (DOE) owns, operates, and manages the buildings and land areas on the Oak Ridge Reservation (ORR) in Oak Ridge, Tennessee. As land and buildings are declared excess or underutilized, it is the intent of DOE to either transfer the title of or lease suitable property to the Community Reuse Organization of East Tennessee (CROET) or other entities for public use. It is DOE's responsibility, in coordination with the U.S. Environmental Protection Agency (EPA), Region 4, and the Tennessee Department of Environment and Conservation (TDEC), to ensure that the land, facilities, and personal property that are to have the title transferred or are to be leased are suitable for public use. Release of personal property must also meet site requirements and be approved by the DOE contractor responsible for site radiological control. The terms title transfer and lease in this document have unique meanings. Title transfer will result in release of ownership without any restriction or further control by DOE. Under lease conditions, the government retains ownership of the property along with the responsibility to oversee property utilization. This includes involvement in the lessee's health, safety, and radiological control plans and conduct of site inspections. It may also entail lease restrictions, such as limiting access to certain areas or prohibiting digging, drilling, or disturbing material under surface coatings. Survey and sampling requirements are generally more rigorous for title transfer than for lease. Because of the accelerated clean up process, there is an increasing emphasis on title transfers of facilities and land. The purpose of this document is to describe the radiological survey and sampling protocols that are being used for assessing the radiological conditions and characteristics of building and land areas on the Oak Ridge Reservation that contain space potentially available for title transfer or lease. 
After the necessary surveys, sampling, and laboratory analyses are completed, the data are analyzed and included in an Environmental Baseline Summary (EBS) report for title transfer or in a Baseline Environmental Analysis Report (BEAR) for lease. The data from the BEAR are then used in a Screening-Level Human Health Risk Assessment (SHHRA) or a risk calculation (RC) to assess the potential risks to future owners/occupants. If title is to be transferred, release criteria in the form of specific activity concentrations called Derived Concentration Guideline Levels (DCGLs) will be developed for each property. The DCGLs are based on the risk model and are used with the data in the EBS to determine, with statistical confidence, that the release criteria for the property have been met. The goal of the survey and sampling efforts is to (1) document the baseline conditions of the property (real or personal) prior to title transfer or lease, (2) obtain enough information that an evaluation of radiological risks can be made, and (3) collect sufficient data so that areas that contain minimal residual levels of radioactivity can be identified and, following radiological control procedures, be released from radiological control. (It should be noted that release from radiological control does not necessarily mean free release, because DOE may maintain institutional control of the site after it is released from radiological control.) To meet the goals of this document, a Data Quality Objective (DQO) process will be used to enhance data collection efficiency and assist with decision making. The steps of the DQO process involve stating the problem, identifying the decision, identifying inputs to the decision, developing study boundaries, developing the decision rule, and optimizing the design. This document describes the DQOs chosen for the surveys and sampling efforts performed for the purposes listed above.
The previous version of this document focused on the requirements for radiological survey and sampling protocols that are to be used for leasing. Because the primary focus at this time is on title transfer, th
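
    The statistical decision rule built on DCGLs can be sketched with the one-sample Sign test commonly used in MARSSIM-style final status surveys: count measurements below the DCGL and ask whether that count is improbably large under a fair-coin binomial. The function below is a simplified illustration, and the measurement values, DCGL, and alpha are hypothetical, not values from this document.

```python
import math

def sign_test_passes(measurements, dcgl, alpha=0.05):
    """Simplified one-sample Sign test (MARSSIM-style sketch).

    Null hypothesis: the median concentration exceeds the DCGL.
    Ties with the DCGL are discarded; the null is rejected (survey
    unit passes) when the count of measurements below the DCGL is
    improbably large under Binomial(n, 0.5).
    """
    diffs = [dcgl - x for x in measurements if x != dcgl]
    n = len(diffs)
    s_plus = sum(1 for d in diffs if d > 0)  # measurements below DCGL
    # P(S >= s_plus) under Binomial(n, 0.5)
    p_value = sum(math.comb(n, k) for k in range(s_plus, n + 1)) / 2 ** n
    return p_value <= alpha

# Hypothetical survey-unit data, same units as the DCGL
clean_unit = [0.4, 0.6, 0.3, 0.5, 0.2, 0.7, 0.4, 0.3, 0.5, 0.6]
print(sign_test_passes(clean_unit, dcgl=1.0))  # all 10 below -> True
```

    Framing the null hypothesis as "the unit exceeds the release criterion" places the burden of proof on the data, which mirrors the conservative posture of the survey protocols described above.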

  9. NATIONAL COMORBIDITY SURVEY (NCS)

    EPA Science Inventory

    The National Comorbidity Survey (NCS) was a collaborative epidemiologic investigation designed to study the prevalence and correlates of DSM III-R disorders and patterns and correlates of service utilization for these disorders. The NCS was the first survey to administer a struct...

  10. AIPS technology survey report

    NASA Technical Reports Server (NTRS)

    Ogletree, Glenn (editor)

    1984-01-01

    The results of a technology survey conducted for the NASA/JSC by the CSDL during Phase 1 of the NASA Advanced Information Processing System (AIPS) program at the CSDL are discussed. The purpose of the survey was to ensure that all technology relevant to the configuration, design, development, verification, implementation, and validation of an advanced information processing system, whether existing or under development and soon to be available, would be duly considered in the development of the AIPS. The emphasis in the survey was on technology items which were clearly relevant to the AIPS. Requirements were developed which guided the planning of contacts with the outside sources to be surveyed, and established practical limits on the scope and content of the Technology Survey. Subjects surveyed included architecture, software, hardware, methods for evaluation of reliability and performance, and methods for the verification of the AIPS design and the validation of the AIPS implementation. Survey requirements and survey results in each of these areas are presented, including analyses of the potential effects on the AIPS development process of using or not using the surveyed technology items. Another output of the survey was the identification of technology areas of particular relevance to the AIPS and for which further development, in some cases by the CSDL and in some cases by the NASA, would be fruitful. 
Appendices are provided in which are presented: (1) reports of some of the actual survey interactions with industrial and other outside information sources; (2) the literature list from the comprehensive literature survey which was conducted; (3) reduced-scale images of an excerpt ('Technology Survey' viewgraphs) from the set of viewgraphs used at the 14 April 1983 Preliminary Requirements Review by the CSDL for the NASA; and (4) reduced-scale images of the set of viewgraphs used in the AIPS Technology Survey Review presentation to the NASA monitors by the CSDL at the NASA Langley Research Center on 28 Sep. 1983.

  11. Design

    ERIC Educational Resources Information Center

    Buchanan, Richard; Cross, Nigel; Durling, David; Nelson, Harold; Owen, Charles; Valtonen, Anna; Boling, Elizabeth; Gibbons, Andrew; Visscher-Voerman, Irene

    2013-01-01

    Scholars representing the field of design were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Richard Buchanan, Nigel Cross, David Durling, Harold Nelson, Charles Owen, and Anna Valtonen. Scholars…

  12. Feedwater heater survey

    SciTech Connect

    Eberle, H.; von Boeckh, P. ); Diaz-Tous, I. ); Bell, R.J. )

    1991-08-01

    Results of a utility survey of high-pressure feedwater heaters are compiled and evaluated, and recommendations are made for the design of heaters for future power plants or for retrofits of existing plants. 2 figs., 20 tabs.

  13. Flat conductor cable survey

    NASA Technical Reports Server (NTRS)

    Swanson, C. R.; Walker, G. L.

    1973-01-01

    Design handbook contains data and illustrations concerned with commercial and Government flat-conductor-cable connecting and terminating hardware. Material was obtained from a NASA-sponsored industry-wide survey of approximately 150 companies and Government agencies.

  14. Prototype ultrasonic instrument for quantitative testing

    NASA Technical Reports Server (NTRS)

    Lynnworth, L. C.; Dubois, J. L.; Kranz, P. R.

    1972-01-01

    A prototype ultrasonic instrument has been designed and developed for quantitative testing. The complete delivered instrument consists of a pulser/receiver which plugs into a standard oscilloscope, an rf power amplifier, a standard decade oscillator, and a set of broadband transducers for typical use at 1, 2, 5 and 10 MHz. The system provides for its own calibration, and on the oscilloscope, presents a quantitative (digital) indication of time base and sensitivity scale factors and some measurement data.

  15. Quantitative Finance CONTACT INFORMATION

    E-print Network

    Arnold, Elizabeth A.

    Quantitative Finance CONTACT INFORMATION: Pamela Peterson Drake, PhD, Department Head, Department of Finance, College of Business, ZSH 325, (540) 568-8107, finkjd@jmu.edu; Michelle Duncan, Advisor, Academic Services Center, College of Business, ZSH 205, (540) 568-3078, duncanml@jmu.edu. What is quantitative finance

  16. NATIONAL MORTALITY FOLLOWBACK SURVEY (NMFS)

    EPA Science Inventory

    The 1993 National Mortality Followback Survey (NMFS) is the latest in a series of periodic surveys designed to supplement information routinely collected on the death certificate. The Mortality Followback Survey Program, begun in the 1960's by the National Center for Health Stati...

  17. National Ambulatory Medical Care Survey

    Cancer.gov

    The National Ambulatory Medical Care Survey (NAMCS) is a national survey designed to collect information about ambulatory medical care services in the United States. Patient data is collected from physicians primarily engaged in direct patient care, excluding those in the specialties of anesthesiology, pathology, and radiology. The survey was conducted annually from 1973 to 1981, in 1985, and annually since 1989.

  18. A Survey of Systemic Risk Analytics

    E-print Network

    Bisias, Dimitrios

    We provide a survey of 31 quantitative measures of systemic risk in the economics and finance literature, chosen to span key themes and issues in systemic risk measurement and management. We motivate these measures from ...

  19. The Quantitative Methods Boot Camp: Teaching Quantitative Thinking and Computing Skills

    E-print Network

    Born, Richard

    EDUCATION. The Quantitative Methods Boot Camp teaches quantitative thinking and computing skills applied to biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis, an integrative approach to teaching programming and quantitative reasoning.

  20. MARSAME Implement The Survey Design 5 IMPLEMENT THE SURVEY DESIGN

    E-print Network

    MARSAME discusses the implementation phase of the data life cycle and focuses on controlling measurement uncertainty during that phase. Similar to MARSSIM, MARSAME excludes specific recommendations concerning physical hazards (e.g., confined spaces, unstable surfaces, heat and cold stress) and scenarios where

  1. On Quantitative Rorschach Scales.

    ERIC Educational Resources Information Center

    Haggard, Ernest A.

    1978-01-01

    Two types of quantitative Rorschach scales are discussed: first, those based on the response categories of content, location, and the determinants, and second, global scales based on the subject's responses to all ten stimulus cards. (Author/JKS)

  2. Quantitative film radiography

    SciTech Connect

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-02-26

    We have developed a system of quantitative radiography in order to produce quantitative images displaying the homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects.
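
    The digitization step such a system rests on is the conversion of transmitted film intensity to optical density, OD = log10(I0/I), which is what makes pseudo-color images "in units of density" possible. The incident intensity and pixel readings below are hypothetical digitizer values, not data from the report.

```python
import math

def optical_density(transmitted, incident):
    """Film optical density: OD = log10(I0 / I).

    Denser (darker) film transmits less light and has a higher OD.
    """
    return math.log10(incident / transmitted)

# Hypothetical digitizer readings, with incident intensity I0 = 1000 counts
pixels = [1000, 500, 100, 10]
densities = [optical_density(i, 1000) for i in pixels]
print(densities)  # [0.0, ~0.301, 1.0, 2.0]
```

    Because OD is logarithmic in transmitted light, each unit of density corresponds to a tenfold drop in transmission, which is why subtle composite density variations need digitization rather than visual film inspection.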

  3. The Survey Questionnaire

    ERIC Educational Resources Information Center

    Ritter, Lois A. Ed.; Sue, Valerie M., Ed.

    2007-01-01

    Internet-based surveys are still relatively new, and researchers are just beginning to articulate best practices for questionnaire design. Online questionnaire design has generally been guided by the principles applying to other self-administered instruments, such as paper-based questionnaires. Web-based questionnaires, however, have the potential…

  4. Quantitative receptor autoradiography

    SciTech Connect

    Boast, C.A.; Snowhill, E.W.; Altar, C.A.

    1986-01-01

    This volume addresses technical and scientific advances in quantitative receptor autoradiography. The volume opens with an overview of the field from a historical and critical perspective. Following is a detailed discussion of in vitro data obtained from a variety of neurotransmitter systems. The next section explores applications of autoradiography, and the final two chapters consider experimental models. Methodological considerations are emphasized, including the use of computers for image analysis.

  5. Quantitative enantioselective Raman spectroscopy.

    PubMed

    Kiefer, J

    2015-08-01

    Analytical methods for quantitative enantioselective measurements are highly desirable in the life sciences. Existing technologies have disadvantages such as limited temporal resolution, the need for molecular labeling, or high experimental complexity. To overcome these limitations, this work presents a method based on conventional Raman spectroscopy. A systematic investigation of the key parameters is carried out. It is demonstrated that their careful choice provides an opportunity for enantioselective and quantitative analysis of enantiopure systems as well as enantiomer mixtures. PMID:26066374
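
    Quantitative analysis of enantiomer mixtures, as pursued in the abstract above, ultimately reduces to recovering the enantiomeric excess from the measured composition. The formula below is the standard definition; treating the two concentrations as directly measurable is a simplification, since in practice they would be inferred from a calibrated spectral model.

```python
def enantiomeric_excess(c_r, c_s):
    """Enantiomeric excess: ee = (cR - cS) / (cR + cS).

    ee = 0 for a racemic mixture, |ee| = 1 for an enantiopure sample.
    """
    return (c_r - c_s) / (c_r + c_s)

print(enantiomeric_excess(0.5, 0.5))    # 0.0  (racemic)
print(enantiomeric_excess(0.75, 0.25))  # 0.5
```
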

  6. Accounting for Imperfect Detection in Ecology: A Quantitative Review

    PubMed Central

    Kellner, Kenneth F.; Swihart, Robert K.

    2014-01-01

    Detection in studies of species abundance and distribution is often imperfect. Assuming perfect detection introduces bias into estimation that can weaken inference upon which understanding and policy are based. Despite availability of numerous methods designed to address this assumption, many refereed papers in ecology fail to account for non-detection error. We conducted a quantitative literature review of 537 ecological articles to measure the degree to which studies of different taxa, at various scales, and over time have accounted for imperfect detection. Overall, just 23% of articles accounted for imperfect detection. The probability that an article incorporated imperfect detection increased with time and varied among taxa studied; studies of vertebrates were more likely to incorporate imperfect detection. Among articles that reported detection probability, 70% contained per-survey estimates of detection that were less than 0.5. For articles in which constancy of detection was tested, 86% reported significant variation. We hope that our findings prompt more ecologists to consider carefully the detection process when designing studies and analyzing results, especially for sub-disciplines where incorporation of imperfect detection in study design and analysis so far has been lacking. PMID:25356904
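
    The review's finding that per-survey detection is often below 0.5 matters because nondetection compounds across visits. A standard back-of-the-envelope check (not code from the paper) for the probability of at least one detection over k independent surveys with constant per-survey detection p is p* = 1 - (1 - p)^k; the p = 0.3 value below is illustrative.

```python
def cumulative_detection(p, k):
    """Probability of detecting a present species at least once in k
    surveys, assuming independent surveys with constant detection p."""
    return 1.0 - (1.0 - p) ** k

# With per-survey p below 0.5 (as most reviewed studies reported),
# several visits are needed before a nondetection is strong evidence
# of absence.
for k in (1, 3, 5):
    print(k, round(cumulative_detection(0.3, k), 3))
```

    Even five visits at p = 0.3 leave roughly a one-in-six chance of missing a species that is present, which is the bias the reviewed occupancy-style methods are designed to correct.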

  7. Tutorial on technology transfer and survey design and data collection for measuring Internet and Intranet existence, usage, and impact (survey-2000) in acute care hospitals in the United States.

    PubMed

    Hatcher, M

    2001-02-01

    This paper provides a tutorial on technology transfer for management information systems in health care. Additionally, it describes the process for a national survey of acute care hospitals using a random sample of 813 hospitals. The purpose of the survey was to measure the levels of Internet and Intranet existence and usage in acute care hospitals. The survey's scope includes e-commerce, both business-to-business and with customers. The relationships with systems approaches, user involvement, user satisfaction, and decision making will be studied. Changes relative to the results of a prior survey conducted in 1997 can be studied, and enabling and inhibiting factors identified. This information will provide benchmarks for hospitals to plan their network technology position and to set goals. PMID:11288480

  8. A Very High Resolution, Deep-Towed Multichannel Seismic Survey in the Yaquina Basin off Peru: Technical Design of the new Deep-Tow Streamer

    NASA Astrophysics Data System (ADS)

    Bialas, J.; Breitzke, M.

    2002-12-01

    Within the project INGGAS, a new deep-towed acoustic profiling instrument consisting of a side scan sonar fish and a 26-channel seismic streamer has been developed for operation at full ocean depth. The digital channels are built from single hydrophones and three engineering nodes (EN), which are connected by either 1 m or 6.5 m long cable segments. Together with high-frequency surface sources (e.g. a GI gun), this hybrid system allows surveys to be completed with target resolutions of higher frequency content than fully surface-based configurations. Consequently, special effort has been devoted to positioning information for the submerged towed instrument. Ultra Short Base Line (USBL) navigation of the tow fish allows precise coordinate evaluation even with more than 7 km of tow cable. Specially designed engineering nodes comprise a single hydrophone with compass, depth, pitch and roll sensors. Optional extension of the streamer up to 96 hydrophone nodes and 75 engineering nodes is possible. A telemetry device allows up- and downlink transmission of all system parameters and all recorded data from the tow fish in real time. Signals from the streamer and the various side scan sensors are multiplexed along the deep-sea cable. Within the telemetry system, coaxial and fiber optic connectors are available and can be chosen according to the ship's needs. If bandwidth is limited, only selected portions of the data are transmitted onboard to provide full online quality control, while a copy of the complete data set is stored within the submerged systems. Onboard, the record strings of side scan and streamer are demultiplexed and distributed to the quality control (QC) systems by Ethernet. A standard marine multichannel control system is used to display shot gathers, spectra and noise monitoring of the streamer channels, as well as for data storage in SEG format. Precise navigation post-processing includes all available positioning information from the vessel (DGPS), the USBL, the streamer (EN) and, optionally, first-break information. Exact positioning of each hydrophone can therefore be provided throughout the entire survey, an essential input for later migration processing of the seismic data.

  9. Pseudo-Empirical Likelihood Inference for Multiple Frame Surveys

    E-print Network

    Wu, Changbao

    effect; Dual-frame surveys; Multiplicity; Survey design; Unequal probability sampling. 1. INTRODUCTION … sampling frame. The Canadian Community Health Survey (CCHS), a cross-sectional survey that collects … designed for the Canadian Labour Force Survey (LFS). A Random Digit Dialing sampling frame and a list frame

  10. Fostering the Development of Quantitative Life Skills through Introductory Astronomy: Can it be Done?

    NASA Astrophysics Data System (ADS)

    Follette, Katherine B.; McCarthy, D. W.

    2012-01-01

    We present preliminary results from a student survey designed to test whether the all-important life skill of numeracy/quantitative literacy can be fostered and improved upon in college students through the vehicle of non-major introductory courses in Astronomy. Many instructors of introductory science courses for non-majors would state that a major goal of our classes is to teach our students to distinguish between science and pseudoscience, truth and fiction, in their everyday lives. It is difficult to believe that such a skill can truly be mastered without a fair amount of mathematical sophistication in the form of arithmetic, statistical and graph reading skills that many American college students unfortunately lack when they enter our classrooms. In teaching what is frequently their "terminal science course in life" can we instill in our students the numerical skills that they need to be savvy consumers, educated citizens and discerning interpreters of the ever-present polls, studies and surveys in which our society is awash? In what may well be their final opportunity to see applied mathematics in the classroom, can we impress upon them the importance of mathematical sophistication in interpreting the statistics that they are bombarded with by the media? Our study is in its second semester, and is designed to investigate to what extent it is possible to improve important quantitative skills in college students through a single semester introductory Astronomy course.

  11. 3D-quantitative structure-activity relationships of human immunodeficiency virus type-1 proteinase inhibitors: comparative molecular field analysis of 2-heterosubstituted statine derivatives-implications for the design of novel inhibitors.

    PubMed

    Kroemer, R T; Ettmayer, P; Hecht, P

    1995-12-01

    A set of 100 novel 2-heterosubstituted statine derivatives inhibiting human immunodeficiency virus type-1 proteinase has been investigated by comparative molecular field analysis. In order to combine the structural information available from X-ray analyses with a predictive quantitative structure-activity relationship (QSAR) model, docking experiments of a prototype compound into the receptor were performed, and the 'active conformation' was determined. The structure of the receptor was taken from the published X-ray analysis of the proteinase with bound MVT-101, the latter compound exhibiting high structural similarity with the inhibitors investigated. The validity of the resulting QSARs was confirmed in four different ways. (1) The common parameters, namely, the cross-validated r2 values obtained by the leave-one-out (LOO) method (r2cv = 0.572-0.593), and (2) the accurate prediction of a test set of 67 compounds (q2 = 0.552-0.569) indicated a high consistency of the models. (3) Repeated analyses with two randomly selected cross-validation groups were performed and the cross-validated r2 values monitored. The resulting average r2 values were of similar magnitudes compared to those obtained by the LOO method. (4) The coefficient fields were compared with the steric and electrostatic properties of the receptor and showed a high level of compatibility. Further analysis of the results led to the design of a novel class of highly active compounds containing an additional linkage between P1' and P3'. The predicted activities of these inhibitors were also in good agreement with the experimentally determined values. PMID:8523405

  12. Charlotte Danielson's Theory of Teacher Evaluations: A Quantitative Study of Teachers' Perceptions on the Four Domains

    ERIC Educational Resources Information Center

    Doerr, Scott E.

    2012-01-01

    This quantitative study determined teachers' perceptions on the four domains of Danielson's framework for teaching. The study surveyed teachers from five school districts regarding the components set forth in Danielson's model for evaluating teachers. The survey was created by Sweeley (2004) and her dissertation chair, Dr. Brogan. The survey

  13. Extragalactic HI Surveys

    E-print Network

    Giovanelli, Riccardo

    2015-01-01

    We review the results of HI line surveys of extragalactic sources in the local Universe. In the last two decades major efforts have been made in establishing on firm statistical grounds the properties of the HI source population, the two most prominent being the HI Parkes All Sky Survey (HIPASS) and the Arecibo Legacy Fast ALFA survey (ALFALFA). We review the choices of technical parameters in the design and optimization of spectro-photometric "blind" HI surveys, which for the first time produced extensive HI-selected data sets. Particular attention is given to the relationship between optical and HI populations, the differences in their clustering properties and the importance of HI-selected samples in contributing to the understanding of apparent conflicts between observation and theory on the abundance of low mass halos. The last section of this paper provides an overview of currently ongoing and planned surveys which will explore the cosmic evolution of properties of the HI population.

  14. Design, Data Collection, Monitoring, Interview Administration Time, and Data Editing in the 1993 National Household Education Survey (NHES:93). Working Paper Series.

    ERIC Educational Resources Information Center

    Brick, J. Michael; Collins, Mary A.; Nolin, Mary Jo; Davies, Elizabeth; Feibus, Mary L.

    The National Household Education Survey (NHES) is a data collection system of the National Center for Education Statistics that collects and publishes data on the condition of education in the United States. It is a telephone survey of the noninstitutionalized population of the country, and it focuses on issues that are best studied through…

  15. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  16. Qualitative and Quantitative Models of Speech Translation

    E-print Network

    heads of phrases. Statistical parameters for structural dependency, lexical transfer, and linear order … a qualitative reasoning model of translation with a quantitative statistical model. We consider these models within the context of two hypothetical speech translation systems, starting with a logic-based design

  17. Towards global benchmarking of food environments and policies to reduce obesity and diet-related non-communicable diseases: design and methods for nation-wide surveys

    PubMed Central

    Vandevijvere, Stefanie; Swinburn, Boyd

    2014-01-01

    Introduction Unhealthy diets are heavily driven by unhealthy food environments. The International Network for Food and Obesity/non-communicable diseases (NCDs) Research, Monitoring and Action Support (INFORMAS) has been established to reduce obesity, NCDs and their related inequalities globally. This paper describes the design and methods of the first-ever, comprehensive national survey on the healthiness of food environments and the public and private sector policies influencing them, as a first step towards global monitoring of food environments and policies. Methods and analysis A package of 11 substudies has been identified: (1) food composition, labelling and promotion on food packages; (2) food prices, shelf space and placement of foods in different outlets (mainly supermarkets); (3) food provision in schools/early childhood education (ECE) services and outdoor food promotion around schools/ECE services; (4) density of and proximity to food outlets in communities; food promotion to children via (5) television, (6) magazines, (7) sport club sponsorships, and (8) internet and social media; (9) analysis of the impact of trade and investment agreements on food environments; (10) government policies and actions; and (11) private sector actions and practices. For the substudies on food prices, provision, promotion and retail, ‘environmental equity’ indicators have been developed to check progress towards reducing diet-related health inequalities. Indicators for these modules will be assessed by tertiles of area deprivation index or school deciles. International ‘best practice benchmarks’ will be identified, against which to compare progress of countries on improving the healthiness of their food environments and policies. Dissemination This research is highly original due to the very ‘upstream’ approach being taken and its direct policy relevance. 
The detailed protocols will be offered to and adapted for countries of varying size and income in order to establish INFORMAS globally as a new monitoring initiative to reduce obesity and diet-related NCDs. PMID:24833697

  18. Very large radio surveys of the sky.

    PubMed

    Condon, J J

    1999-04-27

    Recent advances in electronics and computing have made possible a new generation of large radio surveys of the sky that yield an order-of-magnitude higher sensitivity and positional accuracy. Combined with the unique properties of the radio universe, these quantitative improvements open up qualitatively different and exciting new scientific applications of radio surveys. PMID:10220365

  19. KE Basin underwater visual fuel survey

    SciTech Connect

    Pitner, A.L.

    1995-02-01

    Results of an underwater video fuel survey in KE Basin using a high resolution camera system are presented. Quantitative and qualitative information on fuel degradation are given, and estimates of the total fraction of ruptured fuel elements are provided. Representative photographic illustrations showing the range of fuel conditions observed in the survey are included.

  1. 78 FR 64911 - 2013 Company Organization Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-30

    ... Bureau of the Census 2013 Company Organization Survey AGENCY: Bureau of the Census, Commerce. ACTION... Organization Survey. The survey's data are needed, in part, to update the multilocation companies in the Business Register. The survey, which has been conducted annually since 1974, is designed to...

  2. 75 FR 71417 - 2010 Company Organization Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-23

    ... of the Census 2010 Company Organization Survey AGENCY: Bureau of the Census, Commerce. ACTION: Notice... Organization Survey. The survey's data are needed, in part, to update the multilocation companies in the Business Register. The survey, which has been conducted annually since 1974, is designed to...

  3. 76 FR 62759 - 2011 Company Organization Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-11

    ... of the Census 2011 Company Organization Survey AGENCY: Bureau of the Census, Commerce. ACTION: Notice... Organization Survey. The survey's data are needed, in part, to update the multilocation companies in the Business Register. The survey, which has been conducted annually since 1974, is designed to...

  4. Montana State University 1 Land Surveying Minor

    E-print Network

    Maxwell, Bruce D.

    Montana State University Land Surveying Minor. This minor is designed to provide students with the perspective and skills to pursue a successful career in surveying or a surveying-related field. The focus is on courses related to surveying, such as photogrammetry and global positioning systems

  5. Terminating Sequential Delphi Survey Data Collection

    ERIC Educational Resources Information Center

    Kalaian, Sema A.; Kasim, Rafa M.

    2012-01-01

    The Delphi survey technique is an iterative mail or electronic (e-mail or web-based) survey method used to obtain agreement or consensus among a group of experts in a specific field on a particular issue through a well-designed and systematic multiple sequential rounds of survey administrations. Each of the multiple rounds of the Delphi survey

  6. Integrated Geophysical Methods Applied to Geotechnical and Geohazard Engineering: From Qualitative to Quantitative Analysis and Interpretation

    NASA Astrophysics Data System (ADS)

    Hayashi, K.

    2014-12-01

    The near-surface is the region of day-to-day human activity on the Earth. It is exposed to natural phenomena that sometimes cause disasters. This presentation covers a broad spectrum of geotechnical and geohazard approaches to mitigating disasters and conserving the natural environment using geophysical methods, and emphasizes the contribution of geophysics to such issues. The presentation focuses on the usefulness of geophysical surveys in providing information to mitigate disasters, rather than on the theoretical details of particular techniques. Several techniques are introduced at the level of concept and application. Topics include various geohazard and geoenvironmental applications, such as earthquake disaster mitigation, prevention of floods triggered by torrential rain, environmental conservation, and studying the effects of global warming. Among the geophysical techniques, the active and passive surface wave, refraction and resistivity methods are mainly highlighted. Together with the geophysical techniques, several related issues, such as performance-based design, standardization or regularization, internet access and databases, are also discussed. The presentation discusses the application of geophysical methods to engineering investigations from a non-uniqueness point of view and introduces the concepts of "integrated" and "quantitative" analysis. Most geophysical analyses are essentially non-unique, and it is very difficult to obtain unique and reliable engineering solutions from only one geophysical method (Fig. 1). The only practical way to improve the reliability of an investigation is the joint use of several geophysical and geotechnical investigation methods: an integrated approach to geophysics. The result of a geophysical method is generally vague: "here is a high-velocity layer; it may be bedrock," or "this low-resistivity section may contain clayey soils." Such vague, qualitative and subjective interpretation is of little use in practical engineering design work. Engineers need more quantitative information. To apply geophysical methods to engineering design work, quantitative interpretation is essential. The presentation introduces several case studies from different countries around the world (Fig. 2) from the integrated and quantitative points of view.

  7. Quantitative photoacoustic tomography

    PubMed Central

    Yuan, Zhen; Jiang, Huabei

    2009-01-01

    In this paper, several algorithms that allow for quantitative photoacoustic reconstruction of tissue optical, acoustic and physiological properties are described in a finite-element method based framework. These quantitative reconstruction algorithms are compared, and the merits and limitations associated with these methods are discussed. In addition, a multispectral approach is presented for concurrent reconstructions of multiple parameters including deoxyhaemoglobin, oxyhaemoglobin and water concentrations as well as acoustic speed. Simulation and in vivo experiments are used to demonstrate the effectiveness of the reconstruction algorithms presented. PMID:19581254
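The multispectral step described above, recovering chromophore concentrations from absorption reconstructed at several wavelengths, reduces to a linear unmixing problem. The sketch below illustrates only that idea; the extinction-coefficient matrix and concentrations are invented placeholder values, not data from the paper:

```python
import numpy as np

# Hedged sketch of multispectral unmixing: once absorption mu_a has been
# reconstructed at several wavelengths, concentrations of chromophores
# follow from a least-squares solve. Values in E are illustrative only.
E = np.array([            # rows: wavelengths, cols: [Hb, HbO2, water]
    [1.0, 0.3, 0.05],
    [0.4, 0.9, 0.10],
    [0.2, 0.5, 0.80],
    [0.7, 0.7, 0.20],
])
c_true = np.array([15.0, 60.0, 0.5])   # arbitrary "true" concentrations
mu_a = E @ c_true                       # simulated reconstructed absorption

# Least-squares unmixing recovers the concentrations from mu_a.
c_est, *_ = np.linalg.lstsq(E, mu_a, rcond=None)
print(np.round(c_est, 6))
```

With more wavelengths than chromophores, the overdetermined system also averages down noise in the reconstructed absorption maps.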

  8. Critical Quantitative Inquiry in Context

    ERIC Educational Resources Information Center

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  9. Slone Survey

    Cancer.gov

    The Slone Survey, run by the Slone Epidemiology Center at Boston University, is an ongoing telephone survey of medication use in the U.S. population. The survey began in 1998 and to date over 19,500 subjects have been interviewed.

  10. Quantitative Decision Making.

    ERIC Educational Resources Information Center

    Baldwin, Grover H.

    The use of quantitative decision making tools provides the decision maker with a range of alternatives among which to decide, permits acceptance and use of the optimal solution, and decreases risk. Training line administrators in the use of these tools can help school business officials obtain reliable information upon which to base district…

  11. Leadership In Quantitative Excellence

    E-print Network

    Frey, Robert J.

    Darnell, Managing Director, GMO; Eric Wepsic, Managing Director, DE Shaw … Presents Leadership … for the development, trading, risk management, and marketing of quantitative long-short, long-only, and overlay … focus is the process by which capital markets incorporate information about future cash flows and risks

  12. Quantitative Simulation Games

    NASA Astrophysics Data System (ADS)

    ?erný, Pavol; Henzinger, Thomas A.; Radhakrishna, Arjun

    While a boolean notion of correctness is given by a preorder on systems and properties, a quantitative notion of correctness is defined by a distance function on systems and properties, where the distance between a system and a property provides a measure of "fit" or "desirability." In this article, we explore several ways in which the simulation preorder can be generalized to a distance function. This is done by equipping the classical simulation game between a system and a property with quantitative objectives. In particular, for systems that satisfy a property, a quantitative simulation game can measure the "robustness" of the satisfaction, that is, how much the system can deviate from its nominal behavior while still satisfying the property. For systems that violate a property, a quantitative simulation game can measure the "seriousness" of the violation, that is, how much the property has to be modified so that it is satisfied by the system. These distances can be computed in polynomial time, since the computation reduces to the value problem in limit-average games with constant weights. Finally, we demonstrate how the robustness distance can be used to measure how many transmission errors are tolerated by error-correcting codes.

  13. Quality Technology & Quantitative Management

    E-print Network

    Nguyen, Nam-Ky

    for the construction of D-optimal designs are given. A faster implementation of the FEA is presented, which is referred to as fast-FEA (denoted FFEA). The FFEA was applied to construct D-optimal designs for several published … measure of design goodness). The most popular design criterion is D-optimality, which involves selecting

  14. Survey design and model appraisal based on resolution analysis for 4D gravity monitoring, by Kristofer Davis, M. Andy Kass, Rich A. Krahenbuhl, and Yaoguo Li

    E-print Network

    , there has been an emergence of 4D gravity surveys, particularly for reservoir monitoring. In such cases the reservoir?' We look at the theory behind this question and where the standard assumptions are no longer

  15. Quantitative Articles: Developing Studies for Publication in Counseling Journals

    ERIC Educational Resources Information Center

    Trusty, Jerry

    2011-01-01

    This article is presented as a guide for developing quantitative studies and preparing quantitative manuscripts for publication in counseling journals. It is intended as an aid for aspiring authors in conceptualizing studies and formulating valid research designs. Material is presented on choosing variables and measures and on selecting…

  16. Quantitative evaluation of self-checking circuits

    NASA Astrophysics Data System (ADS)

    Lu, D. J.; McCluskey, E. J.

    1984-04-01

    Quantitative measures of self-checking power are defined for the evaluation, comparison, and design of self-checking circuits. The self-testing and fault-secure properties have corresponding quantitative measures: the testing input fraction (TIF) and the secure input fraction (SIF). Averaging these measures over the fault set yields basic figures of merit. These simple averages can conceal faults with low values of TIF or SIF. Improved figures of merit, based on geometric means, are defined to provide greater sensitivity to low TIF or SIF. As a demonstration, self-checking linear feedback shift registers (LFSRs) based on duplication and serial parity prediction are evaluated.
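The contrast between the simple average and the geometric-mean figure of merit can be sketched numerically. The per-fault TIF values below are invented for illustration; only the averaging idea comes from the abstract:

```python
import math

# Hedged sketch: arithmetic vs geometric averaging of per-fault testing
# input fractions (TIF). A fault that is almost never tested barely moves
# the arithmetic mean but collapses the geometric mean.
def arithmetic_merit(fractions):
    return sum(fractions) / len(fractions)

def geometric_merit(fractions):
    # The geometric mean is sensitive to near-zero TIF/SIF values that a
    # simple average conceals.
    return math.prod(fractions) ** (1.0 / len(fractions))

tifs = [0.9, 0.95, 0.85, 0.01]  # one fault is almost never tested
print(round(arithmetic_merit(tifs), 3))  # ~0.68: looks acceptable
print(round(geometric_merit(tifs), 3))   # ~0.29: flags the weak fault
```

The same computation applies to SIF values for the fault-secure property.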

  17. The Wide Field Imager Lyman-Alpha Search (WFILAS) for Galaxies at Redshift ~5.7: II. Survey Design and Sample Analysis

    E-print Network

    E. Westra; D. Heath Jones; C. E. Lidman; K. Meisenheimer; R. M. Athreya; C. Wolf; T. Szeifert; E. Pompei; L. Vanzi

    2006-08-02

    Context: Wide-field narrowband surveys are an efficient way of searching large volumes of high-redshift space for distant galaxies. Aims: We describe the Wide Field Imager Lyman-Alpha Search (WFILAS) over 0.74 sq. degrees for bright emission-line galaxies at z~5.7. Methods: WFILAS uses deep images taken with the Wide Field Imager (WFI) on the ESO/MPI 2.2m telescope in three narrowband (70 A) filters, one encompassing intermediate-band (220 A) filter and two broadband filters, B and R. We use the novel technique of an encompassing intermediate-band filter to exclude false detections. Images taken with the broadband B and R filters are used to remove low-redshift galaxies from our sample. Results: We present a sample of seven Lyα-emitting galaxy candidates, two of which are spectroscopically confirmed. Compared to other surveys all our candidates are bright; the results of this survey complement other narrowband surveys at this redshift. Most of our candidates are in the regime of bright luminosities, beyond the reach of less voluminous surveys. Adding our candidates to those of another survey increases the derived luminosity density by ~30%. We also find potential clustering in the Chandra Deep Field South, supporting overdensities discovered by other surveys. Based on a FORS2/VLT spectrum we additionally present the analysis of the second confirmed Lyα-emitting galaxy in our sample. We find that it is the brightest Lyα-emitting galaxy (1 x 10^-16 erg s^-1 cm^-2) at this redshift to date and the second confirmed candidate of our survey. Both objects exhibit the presence of a possible second Lyα component redward of the line.

  18. Theory Survey or Survey Theory?

    ERIC Educational Resources Information Center

    Dean, Jodi

    2010-01-01

    Matthew Moore's survey of political theorists in U.S. American colleges and universities is an impressive contribution to political science (Moore 2010). It is the first such survey of political theory as a subfield, the response rate is very high, and the answers to the survey questions provide new information about how political theorists look…

  19. HEDGEROW SURVEY, GREAT CRESTED NEWT SURVEY, DORMOUSE SURVEY AND HORSESHOE BAT ACTIVITY SURVEYS AT UNIVERSITY OF

    E-print Network

    Burton, Geoffrey R.

    HEDGEROW SURVEY, GREAT CRESTED NEWT SURVEY, DORMOUSE SURVEY AND HORSESHOE BAT ACTIVITY SURVEYS-UNIBAT-1624 … to undertake a hedgerow survey, a great crested newt survey, a dormouse survey and horseshoe bat activity surveys

  20. QUANTITY: An Isobaric Tag for Quantitative Glycomics

    PubMed Central

    Yang, Shuang; Wang, Meiyao; Chen, Lijun; Yin, Bojiao; Song, Guoqiang; Turko, Illarion V.; Phinney, Karen W.; Betenbaugh, Michael J.; Zhang, Hui; Li, Shuwei

    2015-01-01

    Glycans are an important class of macromolecules that play numerous biological roles. Quantitative glycomics, the analysis of glycans at a global level, however lags far behind genomics and proteomics owing to technical challenges associated with glycans' chemical properties and structural complexity. As a result, technologies that can facilitate global glycan analysis are highly sought after. Here, we present QUANTITY (Quaternary Amine Containing Isobaric Tag for Glycan), a quantitative approach that can not only enhance detection of glycans by mass spectrometry, but also allow high-throughput glycomic analysis of multiple biological samples. This robust tool enabled us to accomplish a glycomic survey of bioengineered Chinese Hamster Ovary (CHO) cells with knock-in/knock-out of enzymes involved in protein glycosylation. Our results demonstrate that QUANTITY is an invaluable technique for glycan analysis and bioengineering. PMID:26616285

  1. Quantitative microdialysis under transient conditions.

    PubMed

    Olson, R J; Justice, J B

    1993-04-15

    A microdialysis method is described that allows for the quantitative determination of extracellular analyte concentration under transient conditions. The method provides the extracellular concentration and the in vivo probe recovery as a function of time. The technique is based on the method for steady-state conditions but differs in the use of a between-group rather than a within-group design. Following cocaine and amphetamine administration, a significantly greater increase in extracellular dopamine (DA) was found than was estimated from the dialysate using conventional microdialysis methods. The discrepancy is due to microdialysis probe recovery decreasing concurrently with the increase observed in extracellular DA following cocaine and amphetamine injections. The method estimates the extracellular concentration independently of any changes in recovery. PMID:8494171
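The recovery correction at the heart of this approach can be sketched in one line: the extracellular concentration is the dialysate concentration divided by the (time-varying) probe recovery. The numbers below are invented for illustration and show how a concurrent drop in recovery makes conventional dialysate readings understate the true rise:

```python
# Hedged illustration (all values invented): why a drop in probe recovery
# causes conventional dialysate readings to underestimate the true rise
# in extracellular concentration, as the abstract reports after cocaine.
def extracellular(dialysate: float, recovery: float) -> float:
    """Estimate extracellular concentration from dialysate concentration
    and in vivo probe recovery (fractional, 0-1)."""
    return dialysate / recovery

baseline = extracellular(dialysate=5.0, recovery=0.20)    # 25 (a.u.)
post_drug = extracellular(dialysate=10.0, recovery=0.10)  # 100 (a.u.)
# About a 4-fold extracellular rise, though the dialysate only doubled.
print(post_drug / baseline)
```

If recovery were wrongly assumed constant, the estimated increase would be only the 2-fold change seen in the dialysate.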

  2. A Survey of Health Care Models that Encompass Multiple Departments

    E-print Network

    Boucherie, Richard J.

    A Survey of Health Care Models that Encompass Multiple Departments. Peter T. Vanberkel, Richard J. Boucherie, … Abstract: In this survey we review quantitative health care models to illustrate the extent to which

  3. Applying quantitative models to evaluate complexity in video game systems

    E-print Network

    Tanwanteng, Matthew (Matthew E.)

    2009-01-01

    This thesis proposes a games evaluation model that reports significant statistics about the complexity of a game's various systems. Quantitative complexity measurements allow designers to make accurate decisions about how ...

  4. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  6. Quantitative sensory testing.

    PubMed

    Soomekh, David

    2006-07-01

    The diagnosis and treatment of peripheral neuropathy from any cause has come to the forefront of the research community in the past few years. Both established and new diagnostic and treatment options have been and are being studied to better understand and properly treat this debilitating and sometimes devastating disease. One such advance is the clinical use of quantitative sensory testing. To identify the etiology of neuropathy early, a testing instrument must detect changes throughout the course of the disease, have a normative database, and show a clear distinction between the absence and presence of disease. The pressure-specified sensory device (PSSD) was developed in 1992 to painlessly and accurately quantify cutaneous pressure thresholds. PMID:16958387

  7. Quantitative criteria for insomnia.

    PubMed

    Lichstein, K L; Durrence, H H; Taylor, D J; Bush, A J; Riedel, B W

    2003-04-01

Formal diagnostic systems (DSM-IV, ICSD, and ICD-10) do not provide adequate quantitative criteria to diagnose insomnia. This may not present a serious problem in clinical settings, where extensive interviews determine the need for clinical management. However, the lack of standard criteria introduces disruptive variability into the insomnia research domain. The present study reviewed two decades of psychology clinical trials for insomnia to determine common practice with regard to frequency, severity, and duration criteria for insomnia. Modal patterns established standard criteria for frequency (≥3 nights a week) and duration (≥6 months). We then applied four versions of severity criteria to a random sample and used sensitivity-specificity analyses to identify the most valid criterion. We found that a severity of sleep onset latency or wake time after sleep onset of (a) ≥31 min, (b) occurring ≥3 nights a week, (c) for ≥6 months constitutes the most defensible quantitative criteria for insomnia. PMID:12643966
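The modal criteria above reduce to three numeric thresholds, which can be sketched as a simple screening function (a minimal illustration; the function and argument names are invented, not taken from the paper):

```python
# Sketch of the quantitative insomnia criteria reported by Lichstein et al. (2003):
# sleep onset latency (SOL) or wake time after sleep onset (WASO) >= 31 min,
# occurring >= 3 nights per week, persisting >= 6 months.
# Function and argument names are illustrative, not from the paper.

def meets_insomnia_criteria(sol_or_waso_min, nights_per_week, duration_months):
    """Return True if all three modal quantitative criteria are satisfied.

    sol_or_waso_min: typical SOL or WASO on affected nights, in minutes.
    """
    return (sol_or_waso_min >= 31
            and nights_per_week >= 3
            and duration_months >= 6)
```

For example, a respondent reporting 45 min of wake after sleep onset on 4 nights a week for 8 months would screen positive, while 20 min on 5 nights a week would not, regardless of duration.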

  8. Energy & Climate: Getting Quantitative

    NASA Astrophysics Data System (ADS)

    Wolfson, Richard

    2011-11-01

A noted environmentalist claims that buying an SUV instead of a regular car is energetically equivalent to leaving your refrigerator door open for seven years. A fossil-fuel apologist argues that solar energy is a pie-in-the-sky dream promulgated by naïve environmentalists, because there's nowhere near enough solar energy to meet humankind's energy demand. A group advocating shutdown of the Vermont Yankee nuclear plant claims that 70% of its electrical energy is lost in transmission lines. Around the world, thousands agitate for climate action under the numerical banner "350." Neither the environmentalist, the fossil-fuel apologist, the antinuclear activists, nor most of those marching under the "350" banner can back up their assertions with quantitative arguments. Yet questions about energy and its environmental impacts almost always require quantitative answers. Physics can help! This poster gives some cogent examples, based on the newly published second edition of the author's textbook Energy, Environment, and Climate.

  9. Primary enzyme quantitation

    DOEpatents

    Saunders, G.C.

    1982-03-04

    The disclosure relates to the quantitation of a primary enzyme concentration by utilizing a substrate for the primary enzyme labeled with a second enzyme which is an indicator enzyme. Enzyme catalysis of the substrate occurs and results in release of the indicator enzyme in an amount directly proportional to the amount of primary enzyme present. By quantifying the free indicator enzyme one determines the amount of primary enzyme present.
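Because the released indicator enzyme is directly proportional to the primary enzyme present, the quantitation step is a one-line calibration calculation. A minimal sketch (function name, units, and calibration value are invented, not from the patent):

```python
# Illustrative sketch of the patent's proportionality scheme: indicator-enzyme
# activity released by catalysis is directly proportional to the amount of
# primary enzyme, so a calibration slope (determined from a series of known
# primary-enzyme amounts) converts measured activity into primary-enzyme units.
# Names and values are invented for illustration.

def primary_enzyme_units(indicator_activity, slope):
    """Convert measured free indicator-enzyme activity to primary-enzyme units.

    slope: indicator activity released per unit of primary enzyme.
    """
    return indicator_activity / slope

# e.g., with a calibration slope of 4.0 activity units per enzyme unit:
amount = primary_enzyme_units(12.0, 4.0)  # -> 3.0 primary-enzyme units
```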

  10. Exercising QS: Quantitative Skills in an Exercise Science Course

    ERIC Educational Resources Information Center

    Wilson, T. M.

    2013-01-01

    This study seeks to bring the discipline of exercise science into the discussion of Quantitative Skills (QS) in science. The author's experiences of providing learning support to students and working with educators in the field are described, demonstrating the difficulty of encouraging students to address their skills deficit. A survey of…

  11. Quantitative Methods for Administrative Decision Making in Junior Colleges.

    ERIC Educational Resources Information Center

    Gold, Benjamin Knox

    With the rapid increase in number and size of junior colleges, administrators must take advantage of the decision-making tools already used in business and industry. This study investigated how these quantitative techniques could be applied to junior college problems. A survey of 195 California junior college administrators found that the problems…

  12. ORIGINAL PAPER A Quantitative Perspective on Ethics in Large Team

    E-print Network

A Quantitative Perspective on Ethics in Large Team Science. Alexander M. Petersen. To this end, our expository analysis provides a survey of ethical issues in team settings to inform science ethics education and science policy. Keywords: Team science · Team science ethics · Team science...

  13. Chapter B in Geological Survey research 1966

    USGS Publications Warehouse

    U.S. Geological Survey

    1966-01-01

    This collection of 43 short papers is the first published chapter of 'Geological Survey Research 1966.' The papers report on scientific and economic results of current work by members of the Conservation, Geologic, Topographic, and Water Resources Divisions of the U.S. Geological Survey. Chapter A, to be published later in the year, will present a summary of significant results of work done during fiscal year 1966, together with lists of investigations in progress, reports published, cooperating agencies, and Geological Survey offices. 'Geological Survey Research 1966' is the seventh volume of the annual series Geological Survey Research. The six volumes already published are listed below, with their series designations. Geological Survey Research 1960-Prof. Paper 400 Geological Survey Research 1961-Prof. Paper 424 Geological Survey Research 1962-Prof. Paper 450 Geological Survey Research 1963-Prof. Paper 475 Geological Survey Research 1964-Prof. Paper 501 Geological Survey Research 1965-Prof. Paper 525

  14. Measuring teamwork in primary care: Triangulation of qualitative and quantitative data.

    PubMed

    Brown, Judith Belle; Ryan, Bridget L; Thorpe, Cathy; Markle, Emma K R; Hutchison, Brian; Glazier, Richard H

    2015-09-01

This article describes the triangulation of qualitative dimensions, reflecting high functioning teams, with the results of standardized teamwork measures. The study used a mixed methods design using qualitative and quantitative approaches to assess teamwork in 19 Family Health Teams in Ontario, Canada. This article describes dimensions from the qualitative phase using grounded theory to explore the issues and challenges to teamwork. Two quantitative measures were used in the study, the Team Climate Inventory (TCI) and the Providing Effective Resources and Knowledge (PERK) scale. For the triangulation analysis, the mean scores of these measures were compared with the qualitatively derived ratings for the dimensions. The final sample for the qualitative component was 107 participants. The qualitative analysis identified 9 dimensions related to high team functioning such as common philosophy, scope of practice, conflict resolution, change management, leadership, and team evolution. From these dimensions, teams were categorized numerically as high, moderate, or low functioning. Three hundred seventeen team members completed the survey measures. Mean site scores for the TCI and PERK were 3.87 and 3.88, respectively (out of 5). The TCI was associated with all dimensions except for team location, space allocation, and executive director leadership. The PERK was associated with all dimensions except team location. Data triangulation provided qualitative and quantitative evidence of what constitutes teamwork. Leadership was pivotal in forging a common philosophy and encouraging team collaboration. Teams used conflict resolution strategies and adapted to the changes they encountered. These dimensions advanced the team's evolution toward a high functioning team. PMID:25730503

  15. A Quantitative Comparison of Leading-edge Vortices in Incompressible and Supersonic Flows

    NASA Technical Reports Server (NTRS)

    Wang, F. Y.; Milanovic, I. M.; Zaman, K. B. M. Q.

    2002-01-01

    When requiring quantitative data on delta-wing vortices for design purposes, low-speed results have often been extrapolated to configurations intended for supersonic operation. This practice stems from a lack of database owing to difficulties that plague measurement techniques in high-speed flows. In the present paper an attempt is made to examine this practice by comparing quantitative data on the nearwake properties of such vortices in incompressible and supersonic flows. The incompressible flow data are obtained in experiments conducted in a low-speed wind tunnel. Detailed flow-field properties, including vorticity and turbulence characteristics, obtained by hot-wire and pressure probe surveys are documented. These data are compared, wherever possible, with available data from a past work for a Mach 2.49 flow for the same wing geometry and angles-of-attack. The results indicate that quantitative similarities exist in the distributions of total pressure and swirl velocity. However, the streamwise velocity of the core exhibits different trends. The axial flow characteristics of the vortices in the two regimes are examined, and a candidate theory is discussed.

  16. Survey of Solar Buildings.

    ERIC Educational Resources Information Center

    Gray, Robert; Baker, Steven

    This survey brings together information concerning the growing number of buildings utilizing solar energy and is designed to facilitate the comparison of specific characteristics of the buildings. The 66 U.S. entries are divided into five regions, arranged by state, and roughly by date within each state. Seven entries are from other countries. A…

  17. A Survey Transition Course

    ERIC Educational Resources Information Center

    Johnston, William; McAllister, Alex M.

    2012-01-01

    Successful outcomes for a "Transition Course in Mathematics" have resulted from two unique design features. The first is to run the course as a "survey course" in mathematics, introducing sophomore-level students to a broad set of mathematical fields. In this single mathematics course, undergraduates benefit from an introduction of proof…

  18. MARYLAND BIOLOGICAL STREAM SURVEY

    EPA Science Inventory

    The Maryland Biological Stream Survey (MBSS) is a multi-year probability-based sampling program designed to assess the status of biological resources in non-tidal streams of Maryland. The MBSS is quantifying the extent to which acidic deposition and other human activities have af...

  19. NATIONAL ALCOHOL SURVEY (NAS)

    EPA Science Inventory

    National Alcohol Survey (NAS) is designed to assess the trends in drinking practices and problems in the national population, including attitudes, norms, treatment and experiences and adverse consequences. It also studies the effects of public policy on drinking practices (i.e., ...

  20. NATIONAL HEALTH SURVEY OF BEACHES

    EPA Science Inventory

    Resource Purpose:The annual Beach Survey is designed to gather information about beach water quality, standards, monitoring, and beach health advisories or closures issued during the previous year's bathing season. Each year the survey updates previously submitted beach i...

  21. Regional High School Senior Survey.

    ERIC Educational Resources Information Center

    Day, Philip R., Jr.

    In order to identify the educational needs and aspirations of graduating high school seniors in the service region of the University of Maine at Augusta, a survey instrument was designed and administered to 1,950 seniors at 19 institutions. In all, 1,744 completed surveys were returned, a 92 percent response rate. The data are sub-grouped into…

  1. Assimilation and Language. Survey Brief

    ERIC Educational Resources Information Center

    Pew Hispanic Center, 2004

    2004-01-01

    The Pew Hispanic Center/Kaiser Family Foundation 2002 National Survey of Latinos explored the attitudes and experiences of Latinos on a wide variety of topics. The survey sample was designed to include enough Hispanics from various backgrounds and national origin groups so that in addition to describing Latinos overall, comparisons also could be…

  2. THE PAST, PRESENT, AND FUTURE OF THE REGIONAL ENVIRONMENTAL MONITORING AND ASSESSMENT PROGRAM, PART OF SESSION ON R-EMAP: THE APPLICATION OF EMAP INDICATORS AND DESIGNS TO REGIONAL AND STATE MONITORING PROBLEMS

    EPA Science Inventory

    Over the past 10 years REMAP has taken the EMAP's probability survey design and multiple indicator approach for monitoring the nation's aquatic resources and applied them to regional and state-level monitoring problems. The strength of the EMAP approach lies in its quantitative i...

  3. 75 FR 55585 - Proposed Collection; Comment Request; Generic Clearance for Surveys of Customers and Partners of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-13

    ...is as follows: Quantitative surveys: Estimated Number of Respondents per Survey: 9,820; Estimated Number of Responses per Respondent: 1...Average Burden Hours per Response: 0.25; Estimated Total Annual Burden Hours Requested...

  4. 75 FR 76993 - Submission for OMB Review; Comment Request; Generic Clearance for Surveys of Customers and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-10

    ...is as follows: Quantitative surveys: Estimated Number of Respondents per Survey: 9,820; Estimated Number of Responses per Respondent: 1...Average Burden Hours per Response: 0.25; Estimated Total Annual Burden Hours Requested...

  5. Quantitative ultrasound molecular imaging.

    PubMed

    Yeh, James Shue-Min; Sennoga, Charles A; McConnell, Ellen; Eckersley, Robert; Tang, Meng-Xing; Nourshargh, Sussan; Seddon, John M; Haskard, Dorian O; Nihoyannopoulos, Petros

    2015-09-01

Ultrasound molecular imaging using targeting microbubbles is predominantly a semi-quantitative tool, thus limiting its potential diagnostic power and clinical applications. In the work described here, we developed a novel method for acoustic quantification of molecular expression. E-selectin expression in the mouse heart was induced by lipopolysaccharide. Real-time ultrasound imaging of E-selectin expression in the heart was performed using E-selectin-targeting microbubbles and a clinical ultrasound scanner in contrast pulse sequencing mode at 14 MHz, with a mechanical index of 0.22-0.26. The level of E-selectin expression was quantified using a novel time-signal intensity curve analytical method based on bubble elimination, which consisted of curve-fitting the bi-exponential equation [Formula: see text] to the elimination phase of the myocardial time-signal intensity curve. Ar and Af represent the maximum signal intensities of the retained and freely circulating bubbles in the myocardium, respectively; λr and λf represent the elimination rate constants of the retained and freely circulating bubbles in the myocardium, respectively. Ar correlated strongly with the level of E-selectin expression (|r|>0.8), determined using reverse transcriptase real-time quantitative polymerase chain reaction, and the duration of post-lipopolysaccharide treatment-both linearly related to cell surface E-selectin protein (actual bubble target) concentration in the expression range imaged. Compared with a conventional acoustic quantification method (which used retained bubble signal intensity at 20 min post-bubble injection), this new approach exhibited greater dynamic range and sensitivity and was able to simultaneously quantify other useful characteristics (e.g., the microbubble half-life). In conclusion, quantitative determination of the level of molecular expression is feasible acoustically using a time-signal intensity curve analytical method based on bubble elimination. PMID:26044707
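The elimination-phase analysis described above amounts to fitting a sum of two decaying exponentials, Ar·exp(−λr·t) + Af·exp(−λf·t), to the time-signal intensity curve. A minimal sketch of such a fit on synthetic data (this is not the authors' code; all parameter values, noise levels, and variable names are invented):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical sketch of a bi-exponential elimination-phase fit:
# I(t) = A_r*exp(-lam_r*t) + A_f*exp(-lam_f*t), where the slowly decaying
# component represents retained (targeted) bubbles and the fast component
# freely circulating bubbles. Data below are synthetic.

def bi_exponential(t, A_r, lam_r, A_f, lam_f):
    return A_r * np.exp(-lam_r * t) + A_f * np.exp(-lam_f * t)

rng = np.random.default_rng(0)
t = np.linspace(0, 20, 200)          # minutes after peak enhancement (invented)
true_params = (10.0, 0.05, 40.0, 0.8)  # retained bubbles decay slowly, free fast
signal = bi_exponential(t, *true_params) + rng.normal(0, 0.2, t.size)

popt, _ = curve_fit(bi_exponential, t, signal, p0=(5, 0.1, 20, 1.0))
A_r_fit = popt[0]                     # proxy for molecular expression level
half_life_retained = np.log(2) / popt[1]  # retained-bubble half-life, minutes
```

The fitted Ar plays the role of the expression readout, and the half-life falls out of the same fit, mirroring the paper's point that the method quantifies both simultaneously.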

  6. Electromagnetic Survey

    USGS Multimedia Gallery

    A USGS hydrologist conducts a near-surface electromagnetic induction survey to characterize the shallow earth. The survey was conducted as part of an applied research effort by the USGS Office of Groundwater Branch of Geophysics at Camp Rell, Connecticut, in 2008....

  7. Electromagnetic Survey

    USGS Multimedia Gallery

    USGS hydrologist conducts a broadband electromagnetic survey in New Orleans, Louisiana. The survey was one of several geophysical methods used during USGS applied research on the utility of the multi-channel analysis of surface waves (MASW) seismic method for non-invasive assessment of earthen levee...

  8. Seismic Survey

    USGS Multimedia Gallery

USGS hydrologists conduct a seismic survey in New Orleans, Louisiana. The survey was one of several geophysical methods used during USGS applied research on the utility of the multi-channel analysis of surface waves (MASW) seismic method (not pictured here) for non-invasive assessment of earthen leve...

  9. THE KMOS{sup 3D} SURVEY: DESIGN, FIRST RESULTS, AND THE EVOLUTION OF GALAXY KINEMATICS FROM 0.7 ≤ z ≤ 2.7

    SciTech Connect

    Wisnioski, E.; Förster Schreiber, N. M.; Wuyts, S.; Wuyts, E.; Bandara, K.; Genzel, R.; Bender, R.; Davies, R.; Lang, P.; Mendel, J. T.; Beifiori, A.; Chan, J.; Fabricius, M.; Fudamoto, Y.; Kulkarni, S.; Kurk, J.; Lutz, D.; Wilman, D.; Fossati, M.; Brammer, G.; and others

    2015-02-01

We present the KMOS{sup 3D} survey, a new integral field survey of over 600 galaxies at 0.7 < z < 2.7 using KMOS at the Very Large Telescope. The KMOS{sup 3D} survey utilizes synergies with multi-wavelength ground- and space-based surveys to trace the evolution of spatially resolved kinematics and star formation from a homogeneous sample over 5 Gyr of cosmic history. Targets, drawn from a mass-selected parent sample from the 3D-HST survey, cover the star formation-stellar mass (M {sub *}) and rest-frame (U – V) – M {sub *} planes uniformly. We describe the selection of targets, the observations, and the data reduction. In the first year of data we detect Hα emission in 191 M {sub *} = 3 × 10{sup 9}-7 × 10{sup 11} M {sub ⊙} galaxies at z = 0.7-1.1 and z = 1.9-2.7. In the current sample 83% of the resolved galaxies are rotation dominated, determined from a continuous velocity gradient and v {sub rot}/σ{sub 0} > 1, implying that the star-forming ''main sequence'' is primarily composed of rotating galaxies at both redshift regimes. When considering additional stricter criteria, the Hα kinematic maps indicate that at least ∼70% of the resolved galaxies are disk-like systems. Our high-quality KMOS data confirm the elevated velocity dispersions reported in previous integral field spectroscopy studies at z ≳ 0.7. For rotation-dominated disks, the average intrinsic velocity dispersion decreases by a factor of two from 50 km s{sup –1} at z ∼ 2.3 to 25 km s{sup –1} at z ∼ 0.9. Combined with existing results spanning z ∼ 0-3, we show that disk velocity dispersions follow an evolution that is consistent with the dependence of velocity dispersion on gas fractions predicted by marginally stable disk theory.

  10. Survey over image thresholding techniques and quantitative performance evaluation

    E-print Network

    Aksoy, Selim

Thresholding methods from various categories are compared in the context of nondestructive testing (NDT) applications, across image modalities such as ultrasonic images (Ref. 10) and eddy current images, to identify the thresholding algorithms that perform uniformly better over nondestructive testing and document image...

  11. A Survey of Quantitative Descriptions of Molecular Structure

    PubMed Central

    Guha, Rajarshi; Willighagen, Egon

    2013-01-01

    Numerical characterization of molecular structure is a first step in many computational analysis of chemical structure data. These numerical representations, termed descriptors, come in many forms, ranging from simple atom counts and invariants of the molecular graph to distribution of properties, such as charge, across a molecular surface. In this article we first present a broad categorization of descriptors and then describe applications and toolkits that can be employed to evaluate them. We highlight a number of issues surrounding molecular descriptor calculations such as versioning and reproducibility and describe how some toolkits have attempted to address these problems. PMID:23110530

  12. Quantitative spectroscopy of Deneb

    E-print Network

    Schiller, Florian

    2006-01-01

    Quantitative spectroscopy of luminous BA-type supergiants offers a high potential for modern astrophysics. The degree to which we can rely on quantitative studies of this class of stars as a whole depends on the quality of the analyses for benchmark objects. We constrain the basic atmospheric parameters and fundamental stellar parameters as well as chemical abundances of the prototype A-type supergiant Deneb to unprecedented accuracy (Teff = 8525 +/- 75 K, log(g) = 1.10 +/- 0.05 dex, M_spec = 19 +/- 3 M_sun, L = 1.96 +/- 0.32 *10^5 L_sun, R = 203 +/- 17 R_sun, enrichment with CN-processed matter) by applying a sophisticated hybrid NLTE spectrum synthesis technique which has recently been developed and tested. The study is based on a high-resolution and high-S/N spectrum obtained with the Echelle spectrograph FOCES on the Calar Alto 2.2m telescope. Practically all inconsistencies reported in earlier studies are resolved. Multiple metal ionization equilibria and numerous hydrogen lines from the Balmer, Paschen,...

  13. Quantitative spectroscopy of Deneb

    E-print Network

    Florian Schiller; Norbert Przybilla

    2007-12-01

    Quantitative spectroscopy of luminous BA-type supergiants offers a high potential for modern astrophysics. The degree to which we can rely on quantitative studies of this class of stars as a whole depends on the quality of the analyses for benchmark objects. We constrain the basic atmospheric parameters and fundamental stellar parameters as well as chemical abundances of the prototype A-type supergiant Deneb to unprecedented accuracy (Teff = 8525 +/- 75 K, log(g) = 1.10 +/- 0.05 dex, M_spec = 19 +/- 3 M_sun, L = 1.96 +/- 0.32 *10^5 L_sun, R = 203 +/- 17 R_sun, enrichment with CN-processed matter) by applying a sophisticated hybrid NLTE spectrum synthesis technique which has recently been developed and tested. The study is based on a high-resolution and high-S/N spectrum obtained with the Echelle spectrograph FOCES on the Calar Alto 2.2m telescope. Practically all inconsistencies reported in earlier studies are resolved. Multiple metal ionization equilibria and numerous hydrogen lines from the Balmer, Paschen, Brackett and Pfund series are brought into match simultaneously for the stellar parameter determination. Stellar wind properties are derived from H_alpha line-profile fitting using line-blanketed hydrodynamic non-LTE models. A self-consistent view of Deneb is thus obtained, allowing us to discuss its evolutionary state in detail by comparison with the most recent generation of evolution models for massive stars. (abridged)

  14. São Paulo Megacity Mental Health Survey - a population-based epidemiological study of psychiatric morbidity in the São Paulo metropolitan area: aims, design and field implementation.

    PubMed

    Viana, Maria Carmen; Teixeira, Marlene Galativicis; Beraldi, Fidel; Bassani, Indaiá de Santana; Andrade, Laura Helena

    2009-12-01

The São Paulo Megacity Mental Health Survey is a population-based cross-sectional survey of psychiatric morbidity, assessing a probabilistic sample of household residents in the São Paulo Metropolitan Area, aged 18 years and over. Respondents were selected from a stratified multistage clustered area probability sample of households, covering all 39 municipalities, without replacement. Respondents were assessed using the World Mental Health Survey version of the World Health Organization Composite International Diagnostic Interview (WMH-CIDI), which was translated and adapted into the Brazilian-Portuguese language. Data were collected between May 2005 and April 2007 by trained lay interviewers. The World Mental Health Survey version of the Composite International Diagnostic Interview comprises clinical and non-clinical sections, arranged as Part I and Part II, producing diagnoses according to the Diagnostic and Statistical Manual of Mental Disorders - Fourth Edition, and the International Classification of Diseases - 10th Revision. Mood, anxiety, impulse-control and substance use disorders, and suicide-related behavior, considered core disorders, as well as socio-demographic information, were assessed in all respondents. Non-clinical modules and non-core clinical sections (obsessive-compulsive disorder, post-traumatic stress disorder, gambling, eating disorders, neurasthenia, pre-menstrual disorders, psychotic symptoms and personality traits) were assessed in a sub-sample (2,942 respondents), composed of all respondents with at least one core disorder and a 25% random sample of those who were non-cases. A total of 5,037 individuals were interviewed, with a global response rate of 81.3%. Saliva samples were collected from 1,801 respondents, and the extracted DNA was stored pending further investigation. PMID:20098829

  15. Nature as the Most Important Coping Strategy Among Cancer Patients: A Swedish Survey.

    PubMed

    Ahmadi, Fereshteh; Ahmadi, Nader

    2015-08-01

    The authors have conducted a quantitative survey to examine the extent to which the results obtained in a qualitative study among cancer patients in Sweden (Ahmadi, Culture, religion and spirituality in coping: The example of cancer patients in Sweden, Uppsala, Acta Universitatis Upsaliensis, 2006) are applicable to a wider population of cancer patients in this country. In addition to questions relating to the former qualitative study, this survey also references the RCOPE questionnaire (designed by Kenneth I Pargament) in the design of the new quantitative study. In this study, questionnaires were distributed among persons diagnosed with cancer; 2,355 people responded. The results show that nature has been the most important coping method among cancer patients in Sweden. The highest mean value (2.9) is the factor 'nature has been an important resource to you so that you could deal with your illnesses'. Two out of three respondents (68 %) affirm that this method helped them feel significantly better during or after illness. The second highest average (2.8) is the factor 'listening to 'natural music' (birdsong and the wind)'. Two out of three respondents (66 %) answered that this coping method significantly helped them feel better during illness. The third highest average (2.7) is the factor 'to walk or engage in any activity outdoors gives you a spiritual sense'. This survey concerning the role of nature as the most important coping method for cancer patients confirms the result obtained from the previous qualitative studies. PMID:24363200

  16. A Life in the Universe Survey

    ERIC Educational Resources Information Center

    LoPresto, Michael C.; Hubble-Zdanowski, Jennifer

    2012-01-01

    The "Life in the Universe Survey" is a twelve-question assessment instrument. Largely based on the factors of the Drake equation, it is designed to survey students' initial estimates of its factors and to gauge how estimates change with instruction. The survey was used in sections of a seminar course focusing specifically on life in the universe…

  17. 76 FR 34087 - Broad Stakeholder Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-10

    ... SECURITY Broad Stakeholder Survey AGENCY: National Protection and Programs Directorate, DHS. ACTION: 60-day... comments concerning the Broad Stakeholder Survey. DATES: Comments are encouraged and will be accepted until.... The Broad Stakeholder Survey is designed to gather stakeholder feedback on the effectiveness of...

  18. Survey science in pediatric emergency medicine.

    PubMed

    Zonfrillo, Mark R; Wiebe, Douglas J

    2011-05-01

    Surveys conveniently acquire and summarize valuable information from a target population. The specific aims, design, target sample, mode of distribution, data analysis, and inherent limitations of the survey methodology should be carefully considered to maximize the validity of the results. This review provides guidance on the methods and standards necessary to complete sound survey science. PMID:21546814

  19. Case Study: An Assistive Technology Ethics Survey

    E-print Network

    Oishi, Meeko M. K.

Chapter 9, Case Study: An Assistive Technology Ethics Survey. Peter A. Danielson, Holly Longstaff. This chapter describes the online N-Reasons Ethics and Assistive Technology survey, designed to address key ethical issues in a multidisciplinary workshop on assistive technologies. The survey focused on each of the four workshop topics

  20. PEP surveying procedures and equipment

    SciTech Connect

    Linker, F.

    1982-06-01

The PEP Survey and Alignment System, which employs both laser-based and optical survey methods, is described. The laser is operated in conjunction with the Tektronix 4051 computer and surveying instruments such as ARM and SAM, a system designed to automate data input, data reduction, and the production of alignment instructions. The laser system is used when surveying ring quadrupoles, main bend magnets, and sextupoles, and is optional when surveying RF cavities and insertion quadrupoles. Optical methods usually require that data be manually entered into the computer for alignment, but in some cases an element can be aligned using nominal values of fiducial locations without use of the computer. Optical surveying is used in the alignment of NIT and SIT, low-field bend magnets, wigglers, RF cavities, and insertion quadrupoles.

  1. Assessing the Status of Wild Felids in a Highly-Disturbed Commercial Forest Reserve in Borneo and the Implications for Camera Trap Survey Design

    PubMed Central

    Wearn, Oliver R.; Rowcliffe, J. Marcus; Carbone, Chris; Bernard, Henry; Ewers, Robert M.

    2013-01-01

    The proliferation of camera-trapping studies has led to a spate of extensions in the known distributions of many wild cat species, not least in Borneo. However, we still do not have a clear picture of the spatial patterns of felid abundance in Southeast Asia, particularly with respect to the large areas of highly-disturbed habitat. An important obstacle to increasing the usefulness of camera trap data is the widespread practice of setting cameras at non-random locations. Non-random deployment interacts with non-random space-use by animals, causing biases in our inferences about relative abundance from detection frequencies alone. This may be a particular problem if surveys do not adequately sample the full range of habitat features present in a study region. Using camera-trapping records and incidental sightings from the Kalabakan Forest Reserve, Sabah, Malaysian Borneo, we aimed to assess the relative abundance of felid species in highly-disturbed forest, as well as investigate felid space-use and the potential for biases resulting from non-random sampling. Although the area has been intensively logged over three decades, it was found to still retain the full complement of Bornean felids, including the bay cat Pardofelis badia, a poorly known Bornean endemic. Camera-trapping using strictly random locations detected four of the five Bornean felid species and revealed inter- and intra-specific differences in space-use. We compare our results with an extensive dataset of >1,200 felid records from previous camera-trapping studies and show that the relative abundance of the bay cat, in particular, may have previously been underestimated due to the use of non-random survey locations. Further surveys for this species using random locations will be crucial in determining its conservation status. 
We advocate the more wide-spread use of random survey locations in future camera-trapping surveys in order to increase the robustness and generality of inferences that can be made. PMID:24223717
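The strictly random deployment the authors advocate can be illustrated with a minimal sketch (the bounding-box coordinates, seed, and function name are invented; a real survey would also reject points falling outside the reserve polygon):

```python
import random

# Illustrative sketch: draw strictly random camera-trap locations within a
# rectangular study-area bounding box, so that sampling effort is not biased
# toward particular habitat features (e.g., trails or roads).
# Coordinates and counts are invented for illustration.

def random_locations(n, x_range, y_range, seed=42):
    """Return n (x, y) points drawn uniformly within the bounding box."""
    rng = random.Random(seed)  # fixed seed makes the design reproducible
    return [(rng.uniform(*x_range), rng.uniform(*y_range)) for _ in range(n)]

# e.g., 50 candidate camera stations in a 10 km x 10 km box
stations = random_locations(50, (0.0, 10.0), (0.0, 10.0))
```

Because every point in the box has equal probability of selection, detection frequencies from such a design are not confounded with non-random space-use by the animals, which is the bias the abstract describes.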

  2. Quantitative metamaterial property extraction

    E-print Network

    Schurig, David

    2015-01-01

    We examine an extraction model for metamaterials, not previously reported, that gives precise, quantitative and causal representation of S parameter data over a broad frequency range, up to frequencies where the free space wavelength is only a modest factor larger than the unit cell dimension. The model is comprised of superposed, slab shaped response regions of finite thickness, one for each observed resonance. The resonance dispersion is Lorentzian and thus strictly causal. This new model is compared with previous models for correctness likelihood, including an appropriate Occam's factor for each fit parameter. We find that this new model is by far the most likely to be correct in a Bayesian analysis of model fits to S parameter simulation data for several classic metamaterial unit cells.
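The strictly causal Lorentzian dispersion the model assigns to each response region can be sketched as follows (a minimal illustration assuming the e^(−iωt) time convention; parameter values and names are invented, not taken from the paper):

```python
import numpy as np

# Illustrative sketch (not the author's code): one strictly causal Lorentzian
# resonance of the kind used per slab-shaped response region,
#   chi(w) = F * w0**2 / (w0**2 - w**2 - 1j*gamma*w),
# with oscillator strength F, resonance frequency w0, and damping gamma.
# All parameter values below are invented.

def lorentzian_susceptibility(w, F, w0, gamma):
    return F * w0**2 / (w0**2 - w**2 - 1j * gamma * w)

w = np.linspace(0.1, 3.0, 500)                      # frequency, arbitrary units
chi = lorentzian_susceptibility(w, F=0.5, w0=1.0, gamma=0.05)
eps = 1.0 + chi                                     # permittivity of one region
```

With this sign convention the imaginary part of chi is non-negative for positive frequencies (passive absorption) and peaks near w0, which is what makes the per-resonance response strictly causal.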

  3. Surveying System

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Sunrise Geodetic Surveys are setting up their equipment for a town survey. Their equipment differs from conventional surveying systems that employ transit, rod, and chain to measure angles and distances. They are using ISTAC Inc.'s Model 2002 positioning system, which offers fast, accurate surveying using signals from orbiting satellites. The special utility of the ISTAC Model 2002 is that it can provide positioning of the highest accuracy from Navstar PPS signals because it requires no knowledge of secret codes. It operates by comparing the frequency and time phase of a Navstar signal arriving at one ISTAC receiver with the reception of the same set of signals by another receiver. Data are computer processed and translated into three-dimensional position data: latitude, longitude, and elevation.

  4. A Quantitative Analysis of the Extrinsic and Intrinsic Turnover Factors of Relational Database Support Professionals

    ERIC Educational Resources Information Center

    Takusi, Gabriel Samuto

    2010-01-01

    This quantitative analysis explored the intrinsic and extrinsic turnover factors of relational database support specialists. Two hundred and nine relational database support specialists were surveyed for this research. The research was conducted based on Hackman and Oldham's (1980) Job Diagnostic Survey. Regression analysis and a univariate ANOVA…

  5. Mining the Digital Hamburg/ESO Objective Prism Survey

    E-print Network

    N. Christlieb; L. Wisotzki; D. Reimers

    2000-10-23

    We report on the exploitation of the stellar content of the Hamburg/ESO objective prism survey by quantitative selection methods, such as automatic spectral classification, and first results obtained.

  6. Quantitative Mineralogical Characterization of Oregon Erionite

    NASA Astrophysics Data System (ADS)

    Dogan, A.; Dogan, M.; Ballirano, P.

    2006-12-01

    Erionite has been classified as a Group 1 human carcinogen by the IARC Working Group. The fibrogenic potential of erionite varies from low to high yield of mesothelioma, which may require quantitative characterization of the physicochemical properties of erionite before any experimental design. The toxicity of the mineral is such that quantitative characterization of erionite is extremely important. Yet erionite specimens have often been incompletely or incorrectly characterized, throwing doubt on the results of the work. For example, none of the Turkish erionites published until recently had a balance error (E%) of less than 10%, and the Mg content of the type specimen of erionite-Ca from Maze, Niigata Prefecture, Japan is more than 0.8 cations per formula unit. In the present study, an erionite sample from near Rome, Oregon has been quantitatively characterized using powder X-ray diffraction, Rietveld refinement, scanning electron microscopy, energy dispersive spectroscopy, inductively coupled plasma mass spectrometry, and Mössbauer spectroscopy. The cell parameters of the erionite-K from Oregon are computed as a=13.2217(2) Å and c=15.0671 Å; the chemical composition of the erionite (major oxides, rare earth elements, and other trace elements) is characterized quantitatively. The crystal chemistry of the erionite is computed following the guidelines of the IMA zeolite report of 1997.
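
The balance error E% quoted above is conventionally computed as the mismatch between tetrahedral (Al + Fe³⁺) charge and exchangeable-cation charge. A minimal sketch, assuming the widely used Passaglia-style convention (verify against the convention used in any given study):

```python
def balance_error(al, fe3, na, k, mg, ca, sr=0.0, ba=0.0):
    """Balance error E% of a zeolite analysis, from atoms per formula unit.

    Compares the tetrahedral charge (Al + Fe3+) against the charge of the
    exchangeable cations; analyses with |E%| < 10 are commonly accepted.
    """
    cation_charge = na + k + 2.0 * (mg + ca + sr + ba)
    return 100.0 * ((al + fe3) - cation_charge) / cation_charge

# A hypothetical analysis with a 5% excess of framework charge:
e = balance_error(al=2.10, fe3=0.0, na=0.5, k=0.5, mg=0.25, ca=0.25)  # 5.0
```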

  7. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  8. Robotic Surveying

    SciTech Connect

    Suzy Cantor-McKinney; Michael Kruzic

    2007-03-01

    ZAPATA ENGINEERING challenged our engineers and scientists, which included robotics expertise from Carnegie Mellon University, to design a solution to meet our client's requirements for rapid digital geophysical and radiological data collection of a munitions test range with no down-range personnel. A prime concern of the project was to minimize exposure of personnel to unexploded ordnance and radiation. The field season was limited by extreme heat, cold and snow. Geographical Information System (GIS) tools were used throughout this project to accurately define the limits of mapped areas, build a common mapping platform from various client products, track production progress, allocate resources and relate subsurface geophysical information to geographical features for use in rapidly reacquiring targets for investigation. We were hopeful that our platform could meet the proposed 35 acres per day, towing both a geophysical package and a radiological monitoring trailer. We held our breath and crossed our fingers as the autonomous Speedrower began to crawl across the playa lakebed. We met our proposed production rate, and we averaged just less than 50 acres per 12-hour day using the autonomous platform with a path tracking error of less than +/- 4 inches. Our project team mapped over 1,800 acres in an 8-week (4 days per week) timeframe. The expertise of our partner, Carnegie Mellon University, was recently demonstrated when their two autonomous vehicle entries finished second and third at the 2005 Defense Advanced Research Projects Agency (DARPA) Grand Challenge. 'The Grand Challenge program was established to help foster the development of autonomous vehicle technology that will some day help save the lives of Americans who are protecting our country on the battlefield', said DARPA Grand Challenge Program Manager, Ron Kurjanowicz. 
Our autonomous remote-controlled vehicle (ARCV) was a modified New Holland 2550 Speedrower retrofitted to allow the machine-actuated functions to be controlled by an onboard computer. The computer-controlled Speedrower was developed at Carnegie Mellon University to automate agricultural harvesting. Harvesting tasks require the vehicle to cover a field using minimally overlapping rows at slow speeds in a similar manner to geophysical data acquisition. The Speedrower had demonstrated its ability to perform as it had already logged hundreds of acres of autonomous harvesting. This project is the first use of autonomous robotic technology on a large-scale for geophysical surveying.

  9. A comparison of risk assessment techniques from qualitative to quantitative

    SciTech Connect

    Altenbach, T.J.

    1995-02-13

    Risk assessment techniques vary from purely qualitative approaches, through a regime of semi-quantitative approaches, to the more traditional quantitative ones. Constraints such as time, money, manpower, skills, management perceptions, communication of risk results to the public, and political pressures all affect the manner in which risk assessments are carried out. This paper surveys some risk matrix techniques, examining the uses and applicability of each. Limitations and problems of each technique are presented and compared to the others. Risk matrix approaches vary from purely qualitative axis descriptions of accident frequency vs. consequence, to fully quantitative axis definitions using multi-attribute utility theory to equate different types of risk from the same operation.
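
A minimal sketch of the kind of qualitative risk matrix being surveyed, mapping ordinal frequency and consequence categories to a risk level. The category names and thresholds here are illustrative assumptions, not taken from the paper:

```python
# Ordinal scales for the two matrix axes (illustrative, MIL-STD-882-like).
FREQUENCY = ["improbable", "remote", "occasional", "probable", "frequent"]
CONSEQUENCE = ["negligible", "marginal", "critical", "catastrophic"]

def risk_level(freq, cons):
    """Map a (frequency, consequence) pair to a low/medium/high cell."""
    score = FREQUENCY.index(freq) + CONSEQUENCE.index(cons)
    if score <= 2:
        return "low"
    if score <= 4:
        return "medium"
    return "high"
```

A fully quantitative matrix would replace the ordinal indices with numeric frequency (events/yr) and consequence (e.g., expected loss) axes.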

  10. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965
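
The basic culturomics quantity is the relative frequency of an n-gram per year (its count divided by the total number of n-grams printed that year). A schematic sketch with hypothetical data structures; the real corpus is distributed as large per-year count tables:

```python
def relative_frequency(ngram_counts, totals_by_year, ngram):
    """Yearly relative frequency of `ngram`.

    ngram_counts:   {ngram: {year: count}}  (hypothetical layout)
    totals_by_year: {year: total n-grams printed that year}
    """
    counts = ngram_counts.get(ngram, {})
    return {year: counts.get(year, 0) / totals_by_year[year]
            for year in totals_by_year}

# Toy example: a word that declines between two sampled years.
freqs = relative_frequency({"phrenology": {1860: 21, 1940: 3}},
                           {1860: 1000, 1940: 1000},
                           "phrenology")
```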

  11. 23 CFR Appendix A to Part 1340 - Sample Design

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...an example of a complying survey design and to provide guidance...is recommended that State surveys of safety belt use be designed by qualified survey statisticians. I. Sample...guideline) be eligible for sampling. B. First Stage:...

  12. Questions for Surveys

    PubMed Central

    Schaeffer, Nora Cate; Dykema, Jennifer

    2011-01-01

    We begin with a look back at the field to identify themes of recent research that we expect to continue to occupy researchers in the future. As part of this overview, we characterize the themes and topics examined in research about measurement and survey questions published in Public Opinion Quarterly in the past decade. We then characterize the field more broadly by highlighting topics that we expect to continue or to grow in importance, including the relationship between survey questions and the total survey error perspective, cognitive versus interactional approaches, interviewing practices, mode and technology, visual aspects of question design, and culture. Considering avenues for future research, we advocate for a decision-oriented framework for thinking about survey questions and their characteristics. The approach we propose distinguishes among various aspects of question characteristics, including question topic, question type and response dimension, conceptualization and operationalization of the target object, question structure, question form, response categories, question implementation, and question wording. Thinking about question characteristics more systematically would allow study designs to take into account relationships among these characteristics and identify gaps in current knowledge. PMID:24970951

  13. 2012 Mask Industry Survey

    NASA Astrophysics Data System (ADS)

    Malloy, Matt; Litt, Lloyd C.

    2012-11-01

    A survey supported by SEMATECH and administered by David Powell Consulting was sent to semiconductor industry leaders to gather information about the mask industry as an objective assessment of its overall condition. The survey was designed with the input of semiconductor company mask technologists and merchant mask suppliers. 2012 marks the 11th consecutive year for the mask industry survey. This year's survey and reporting structure are similar to those of the previous years with minor modifications based on feedback from past years and the need to collect additional data on key topics. Categories include general mask information, mask processing, data and write time, yield and yield loss, delivery times, and maintenance and returns. Within each category are multiple questions that result in a detailed profile of both the business and technical status of the mask industry. Results, initial observations, and key comparisons between the 2011 and 2012 survey responses are shown here, including multiple indications of a shift towards the manufacturing of higher end photomasks.

  14. CHINA HEALTH AND NUTRITION SURVEY

    EPA Science Inventory

    The China Health and Nutrition Survey is designed to examine the effects of health, nutrition, and family planning policies and programs as they have been implemented by national and local governments. It is designed to examine how both the social and economic transformation of C...

  15. National Survey of Public School Teachers.

    ERIC Educational Resources Information Center

    2001

    This report presents national survey results of public school teachers' opinions on the relationship between interior design and academic performance. The 1,050 teachers surveyed reveal that they recognize the relationship between interior design and academic achievement and that most teachers see the advantages of classroom carpeting relative to…

  16. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing the environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  17. Quantitative DNA Fiber Mapping

    SciTech Connect

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single-copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface, resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, and their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones, to the construction of high-resolution physical maps, to studies of stalled DNA replication and transcription.
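
The conversion from a measured inter-signal distance on a stretched fiber to kilobase pairs is a single multiplication by the stretching constant quoted above (about 2.3 kb per micrometer). A trivial sketch:

```python
# QDFM stretching constant from the abstract: DNA is stretched
# homogeneously to roughly 2.3 kb per micrometer of fiber.
KB_PER_MICROMETER = 2.3

def fiber_distance_to_kb(distance_um):
    """Physical distance between two hybridization signals, in kb."""
    return distance_um * KB_PER_MICROMETER

# e.g. two probe signals measured 10 um apart lie ~23 kb apart
```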

  18. Quantitative Electron Nanodiffraction.

    SciTech Connect

    Spence, John

    2015-01-30

    This final report summarizes progress under this award for the final reporting period 2002-2013 in our development of quantitative electron nanodiffraction for materials problems, especially devoted to atomistic processes in semiconductors and electronic oxides such as the new artificial oxide multilayers, where our microdiffraction is complemented with energy-loss spectroscopy (ELNES) and aberration-corrected STEM imaging (9). The method has also been used to map out the chemical bonds in the important GaN semiconductor (1) used for solid-state lighting, and to understand the effects of stacking sequence variations and interfaces in digital oxide superlattices (8). Other projects include the development of a laser-beam Zernike phase plate for cryo-electron microscopy (5) (based on the Kapitza-Dirac effect), work on reconstruction of molecular images using the scattering from many identical molecules lying in random orientations (4), a review article on space-group determination for the International Tables for Crystallography (10), the observation of energy-loss spectra with millivolt energy resolution and sub-nanometer spatial resolution from individual point defects in an alkali halide, a review article for the centenary of X-ray diffraction (17), and the development of a new method of electron-beam lithography (12). We briefly summarize here the work on GaN, on oxide superlattice ELNES, and on lithography by STEM.

  19. National Nursing Home Survey

    MedlinePLUS

    CDC's National Nursing Home Survey (NNHS) pages cover the survey's background and methodology, questionnaires and data files, long-term care medication data, the National Nursing Assistant Survey, and survey publications and products.

  20. Survey of digital filtering

    NASA Technical Reports Server (NTRS)

    Nagle, H. T., Jr.

    1972-01-01

    A three part survey is made of the state-of-the-art in digital filtering. Part one presents background material including sampled data transformations and the discrete Fourier transform. Part two, digital filter theory, gives an in-depth coverage of filter categories, transfer function synthesis, quantization and other nonlinear errors, filter structures and computer aided design. Part three presents hardware mechanization techniques. Implementations by general purpose, mini-, and special-purpose computers are presented.

  1. Sample Design.

    ERIC Educational Resources Information Center

    Ross, Kenneth N.

    1987-01-01

    This article considers various kinds of probability and non-probability samples in both experimental and survey studies. Throughout, how a sample is chosen is stressed. Size alone is not the determining consideration in sample selection. Good samples do not occur by accident; they are the result of a careful design. (Author/JAZ)

  2. QUANTITATIVE 15N NMR SPECTROSCOPY

    EPA Science Inventory

    Line intensities in 15N NMR spectra are strongly influenced by spin-lattice and spin-spin relaxation times, relaxation mechanisms and experimental conditions. Special care has to be taken in using 15N spectra for quantitative purposes. Quantitative aspects are discussed for the 1...

  3. The Imaging and Slitless Spectroscopy Instrument for Surveys (ISSIS) for the World Space Observatory--Ultraviolet (WSO-UV): optical design, performance and verification tests.

    NASA Astrophysics Data System (ADS)

    Gómez de Castro, A. I.; Perea, B.; Sánchez, N.; Chirivella, J.; Seijas, J.

    2015-05-01

    ISSIS is the instrument for imaging and slitless spectroscopy on board WSO-UV. The baseline for the ISSIS design, as approved at the PDR held in May 2012, consists of two acquisition channels, both of them equipped with photon-counting detectors based on micro-channel plates (MCPs). These two channels are the Far Ultraviolet (FUV) Channel, covering the 1150-1750 Å wavelength range, and the Near Ultraviolet (NUV) Channel, covering the 1850-3200 Å range. In this work, we present the current ISSIS design and its main characteristics. We present the main performance verification for ISSIS, to ensure that the current design fulfils the scientific requirements and that the in-flight calibration is feasible. We also define the facilities and technical characteristics required to carry out the tests.

  4. Quantitative spectroscopy of Deneb

    NASA Astrophysics Data System (ADS)

    Schiller, F.; Przybilla, N.

    2008-03-01

    Context: Quantitative spectroscopy of luminous BA-type supergiants offers a high potential for modern astrophysics. Detailed studies allow the evolution of massive stars, galactochemical evolution, and the cosmic distance scale to be constrained observationally. Aims: A detailed and comprehensive understanding of the atmospheres of BA-type supergiants is required in order to use this potential properly. The degree to which we can rely on quantitative studies of this class of stars as a whole depends on the quality of the analyses for benchmark objects. We constrain the basic atmospheric parameters and fundamental stellar parameters, as well as chemical abundances of the prototype A-type supergiant Deneb to unprecedented accuracy by applying a sophisticated analysis methodology, which has recently been developed and tested. Methods: The analysis is based on high-S/N and high-resolution spectra in the visual and near-IR. Stellar parameters and abundances for numerous astrophysically interesting elements are derived from synthesis of the photospheric spectrum using a hybrid non-LTE technique, i.e. line-blanketed LTE model atmospheres and non-LTE line formation. Multiple metal ionisation equilibria and numerous hydrogen lines from the Balmer, Paschen, Brackett, and Pfund series are utilised simultaneously for the stellar parameter determination. The stellar wind properties are derived from Hα line-profile fitting using line-blanketed hydrodynamic non-LTE models. Further constraints come from matching the photospheric spectral energy distribution from the UV to the near-IR L band. Results: The atmospheric parameters of Deneb are tightly constrained: effective temperature T_eff = 8525±75 K, surface gravity log g = 1.10±0.05, microturbulence ξ = 8±1 km s-1; macroturbulence and projected rotational velocity v sin i are both 20 ± 2 km s-1. 
The abundance analysis gives a helium enrichment of 0.10 dex relative to solar and an N/C ratio of 4.44 ± 0.84 (mass fraction), implying strong mixing with CN-processed matter. The heavier elements are consistently underabundant by 0.20 dex compared to solar. Peculiar abundance patterns, which were suggested in previous analyses, cannot be confirmed. Accounting for non-LTE effects is essential for removing systematic trends in the abundance determination, for minimising statistical 1σ uncertainties to ~10-20%, and for establishing all ionisation equilibria at the same time. Conclusions: A luminosity of (1.96 ± 0.32)×10^5 L⊙, a radius of 203 ± 17 R⊙, and a current mass of 19 ± 4 M⊙ are derived. Comparison with stellar evolution predictions suggests that Deneb started as a fast-rotating late O-type star with M_ZAMS ≈ 23 M⊙ on the main sequence and is currently evolving to the red supergiant stage. Based on observations collected at the Centro Astronómico Hispano Alemán (CAHA) at Calar Alto, operated jointly by the Max-Planck-Institut für Astronomie and the Instituto de Astrofísica de Andalucía (CSIC). Appendix A is only available in electronic form at http://www.aanda.org
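
The fundamental parameters quoted in the abstract can be cross-checked from the atmospheric parameters with two textbook relations, L = 4πR²σT_eff⁴ and g = GM/R². A minimal sketch using standard CGS constants and only the abstract's values:

```python
import math

G     = 6.674e-8   # gravitational constant, cm^3 g^-1 s^-2
SIGMA = 5.670e-5   # Stefan-Boltzmann constant, erg cm^-2 s^-1 K^-4
L_SUN = 3.828e33   # erg s^-1
R_SUN = 6.957e10   # cm
M_SUN = 1.989e33   # g

def radius_rsun(L_lsun, teff):
    """Radius (in solar radii) from L = 4 pi R^2 sigma Teff^4."""
    L = L_lsun * L_SUN
    return math.sqrt(L / (4.0 * math.pi * SIGMA * teff**4)) / R_SUN

def mass_msun(logg, R_rsun):
    """Mass (in solar masses) from g = G M / R^2, with log g in cgs."""
    g = 10.0**logg
    R = R_rsun * R_SUN
    return g * R * R / G / M_SUN

R = radius_rsun(1.96e5, 8525.0)  # ~203 R_sun, matching the abstract
M = mass_msun(1.10, R)           # ~19 M_sun, matching the abstract
```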

  5. Visual Design Principles: An Empirical Study of Design Lore

    ERIC Educational Resources Information Center

    Kimball, Miles A.

    2013-01-01

    Many books, designers, and design educators talk about visual design principles such as balance, contrast, and alignment, but with little consistency. This study uses empirical methods to explore the lore surrounding design principles. The study took the form of two stages: a quantitative literature review to determine what design principles are…

  6. Quantitative Spectroscopy of Deneb

    NASA Astrophysics Data System (ADS)

    Schiller, Florian; Przybilla, N.

    We use the visually brightest A-type supergiant Deneb (A2 Ia) as a benchmark for testing a spectroscopic analysis technique developed for quantitative studies of BA-type supergiants. Our NLTE spectrum synthesis technique allows us to derive stellar parameters and elemental abundances with unprecedented accuracy. The study is based on a high-resolution and high-S/N spectrum obtained with the Echelle spectrograph FOCES on the Calar Alto 2.2 m telescope. Practically all inconsistencies reported in earlier studies are resolved. A self-consistent view of Deneb is thus obtained, allowing us to discuss its evolutionary state in detail by comparison with the most recent generation of evolution models for massive stars. The basic atmospheric parameters T_eff = 8525 ± 75 K and log g = 1.10 ± 0.05 dex (cgs) and the distance imply the following fundamental parameters for Deneb: M_spec = 17 ± 3 M⊙, L = (1.77 ± 0.29) × 10^5 L⊙, and R = 192 ± 16 R⊙. The derived He and CNO abundances indicate mixing with nuclear-processed matter. The high N/C ratio of 4.64 ± 1.39 and an N/O ratio of 0.88 ± 0.07 (mass fractions) could in principle be explained by evolutionary models with initially very rapid rotation. A mass of ~22 M⊙ is implied for the progenitor on the zero-age main sequence, i.e. it was a late O-type star. Significant mass loss has occurred, probably enhanced by pronounced centrifugal forces. The observational constraints favour a scenario for the evolution of Deneb where the effects of rotational mixing may be amplified by an interaction with a magnetic field. Analogous analyses of such highly luminous BA-type supergiants will allow for precision studies of different galaxies in the Local Group and beyond.

  7. WESF natural phenomena hazards survey

    SciTech Connect

    Wagenblast, G.R., Westinghouse Hanford

    1996-07-01

    A team of engineers conducted a systematic natural phenomena hazards (NPH) survey for the 225-B Waste Encapsulation and Storage Facility (WESF). The survey is an assessment of the existing design documentation that serves as the structural design basis for WESF and for the Interim Safety Basis (ISB). The lateral force resisting systems for the 225-B building structures, and the anchorages for the WESF safety-related systems, were evaluated. The original seismic and other design analyses were technically reviewed. Engineering judgment assessments were made of the probability of NPH survival, including seismic, for the 225-B structures and WESF safety systems. The method for the survey is based on the experience of the investigating engineers and on documented earthquake experience (expected response) data. The survey uses knowledge of NPH performance and engineering experience to determine the WESF strengths for NPH resistance and to uncover possible weak links. The survey, in general, concludes that the 225-B structures and WESF safety systems are designed and constructed commensurate with the current Hanford Site design criteria.

  8. Monitoring and design of stormwater control basins

    USGS Publications Warehouse

    Veenhuis, J.E.; Parrish, J.H.; Jennings, M.E.

    1989-01-01

    The City of Austin, Texas, has played a pioneering role in the control of urban nonpoint source pollution by enacting watershed and stormwater ordinances, overseeing detailed monitoring programs, and improving design criteria for stormwater control methods. The effectiveness of the methods used in Austin, and perhaps in other areas of the United States, to protect urban water resources has not yet been fully established. Therefore, detailed monitoring programs capable of quantitatively determining the effectiveness of control methods and of stormwater ordinances, are required. The purpose of this report is to present an overview of the City of Austin's stormwater monitoring program, including previous monitoring programs with the U.S. Environmental Protection Agency and the U.S. Geological Survey, and to describe the relation of monitoring to design of stormwater control basins.

  9. Systematic survey of the design, statistical analysis, and reporting of studies published in the 2008 volume of the Journal of Cerebral Blood Flow and Metabolism.

    PubMed

    Vesterinen, Hanna M; Vesterinen, Hanna V; Egan, Kieren; Deister, Amelie; Schlattmann, Peter; Macleod, Malcolm R; Dirnagl, Ulrich

    2011-04-01

    Translating experimental findings into clinically effective therapies is one of the major bottlenecks of modern medicine. As this has been particularly true for cerebrovascular research, attention has turned to the quality and validity of experimental cerebrovascular studies. We set out to assess the study design, statistical analyses, and reporting of cerebrovascular research. We assessed all original articles published in the Journal of Cerebral Blood Flow and Metabolism during the year 2008 against a checklist designed to capture the key attributes relating to study design, statistical analyses, and reporting. A total of 156 original publications were included (animal, in vitro, human). Few studies reported a primary research hypothesis, statement of purpose, or measures to safeguard internal validity (such as randomization, blinding, exclusion or inclusion criteria). Many studies lacked sufficient information regarding methods and results to form a reasonable judgment about their validity. In nearly 20% of studies, statistical tests were either not appropriate or information to allow assessment of appropriateness was lacking. This study identifies a number of factors that should be addressed if the quality of research in basic and translational biomedicine is to be improved. We support the widespread implementation of the ARRIVE (Animal Research Reporting In Vivo Experiments) statement for the reporting of experimental studies in biomedicine, for improving training in proper study design and analysis, and that reviewers and editors adopt a more constructively critical approach in the assessment of manuscripts for publication. PMID:21157472

  10. Quantitative analysis of spirality in elliptical galaxies

    NASA Astrophysics Data System (ADS)

    Dojcsak, Levente; Shamir, Lior

    2014-04-01

    We use an automated galaxy morphology analysis method to quantitatively measure the spirality of galaxies classified manually as elliptical. The data set used for the analysis consists of 60,518 galaxy images with redshift obtained by the Sloan Digital Sky Survey (SDSS) and classified manually by Galaxy Zoo, as well as the RC3 and NA10 catalogues. We measure the spirality of the galaxies by using the Ganalyzer method, which transforms the galaxy image to its radial intensity plot to detect galaxy spirality that is in many cases difficult to notice by manual observation of the raw galaxy image. Experimental results using manually classified elliptical and S0 galaxies with redshift <0.3 suggest that galaxies classified manually as elliptical and S0 exhibit a nonzero signal for the spirality. These results suggest that the human eye observing the raw galaxy image might not always be the most effective way of detecting spirality and curves in the arms of galaxies.
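
The radial intensity plot at the heart of the Ganalyzer method can be sketched as a polar resampling of the galaxy image: intensity is sampled along circles of increasing radius around the centre, and spiral structure appears as intensity peaks whose angular position shifts with radius. This is an illustrative re-implementation under assumed conventions, not the actual Ganalyzer code:

```python
import numpy as np

def radial_intensity_plot(image, cx, cy, radii, n_theta=360):
    """Sample image intensity on circles of the given radii around (cx, cy).

    Returns an array of shape (len(radii), n_theta): rows are radii,
    columns are polar angles. Nearest-pixel sampling, clipped at the edges.
    """
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    plot = np.empty((len(radii), n_theta))
    for i, r in enumerate(radii):
        x = np.clip(np.round(cx + r * np.cos(theta)).astype(int), 0, image.shape[1] - 1)
        y = np.clip(np.round(cy + r * np.sin(theta)).astype(int), 0, image.shape[0] - 1)
        plot[i] = image[y, x]
    return plot
```

In a plot like this, an elliptical (non-spiral) light distribution produces peaks at the same angle for every radius, while arm curvature shows up as a systematic drift of the peak angle with radius.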

  11. Surveying Altimeter

    USGS Multimedia Gallery

    This type of altimeter is a precision aneroid barometer that translates barometric (air) pressure into altitude. Temperature compensation calculations, as well as calculations to account for diurnal barometric change, need to be recorded during use. In the field the survey altimeter first must be pla...

  12. Soil Surveys

    NASA Technical Reports Server (NTRS)

    1985-01-01

    An accurate method of surveying the soil was developed by NASA and the Department of Agriculture. The method involves using ground penetrating radar to produce subsurface graphs. By examining printouts from the system's recorder, scientists can determine whether a site is appropriate for building, etc.

  13. Complexity Survey.

    ERIC Educational Resources Information Center

    Gordon, Sandra L.; Anderson, Beth C.

    To determine whether consensus existed among teachers about the complexity of common classroom materials, a survey was administered to 66 pre-service and in-service kindergarten and prekindergarten teachers. Participants were asked to rate 14 common classroom materials as simple, complex, or super-complex. Simple materials have one obvious part,…

  14. Survey of Geothermal Solid Toxic Waste

    SciTech Connect

    Darnell, A.J.; Gay, R.L.; Klenck, M.M.; Nealy, C.L.

    1982-09-30

    This is an early survey and analysis of the types and quantities of solid toxic wastes to be expected from geothermal power systems, particularly at the Salton Sea, California. It includes a literature search (48 references/citations), descriptions of methods for handling wastes, and useful quantitative values. It also includes consideration of reclaiming metals and mineral byproducts from geothermal power systems. (DJE 2005)

  15. Biomedical imaging ontologies: A survey and proposal for future work

    PubMed Central

    Smith, Barry; Arabandi, Sivaram; Brochhausen, Mathias; Calhoun, Michael; Ciccarese, Paolo; Doyle, Scott; Gibaud, Bernard; Goldberg, Ilya; Kahn, Charles E.; Overton, James; Tomaszewski, John; Gurcan, Metin

    2015-01-01

    Background: Ontology is one strategy for promoting interoperability of heterogeneous data through consistent tagging. An ontology is a controlled structured vocabulary consisting of general terms (such as “cell” or “image” or “tissue” or “microscope”) that form the basis for such tagging. These terms are designed to represent the types of entities in the domain of reality that the ontology has been devised to capture; the terms are provided with logical definitions thereby also supporting reasoning over the tagged data. Aim: This paper provides a survey of the biomedical imaging ontologies that have been developed thus far. It outlines the challenges, particularly faced by ontologies in the fields of histopathological imaging and image analysis, and suggests a strategy for addressing these challenges in the example domain of quantitative histopathology imaging. Results and Conclusions: The ultimate goal is to support the multiscale understanding of disease that comes from using interoperable ontologies to integrate imaging data with clinical and genomics data. PMID:26167381

  16. A Survey of Controlled Experiments in Software Engineering

    E-print Network

    A Survey of Controlled Experiments in Software Engineering. Dag I.K. Sjøberg, Member, IEEE. A survey of controlled experiments, reported in articles published in 12 leading software engineering journals and conferences in the decade from 1993, in which subjects performed one or more software engineering tasks. This survey quantitatively characterizes the topics

  17. 7 CFR 1737.31 - Area Coverage Survey (ACS).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 11 2012-01-01 2012-01-01 false Area Coverage Survey (ACS). 1737.31 Section 1737.31... Studies-Area Coverage Survey and Loan Design § 1737.31 Area Coverage Survey (ACS). (a) The Area Coverage Survey (ACS) is a market forecast of service requirements of subscribers in a proposed service area....

  18. 7 CFR 1737.31 - Area Coverage Survey (ACS).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 11 2011-01-01 2011-01-01 false Area Coverage Survey (ACS). 1737.31 Section 1737.31... Studies-Area Coverage Survey and Loan Design § 1737.31 Area Coverage Survey (ACS). (a) The Area Coverage Survey (ACS) is a market forecast of service requirements of subscribers in a proposed service area....

  19. 7 CFR 1737.31 - Area Coverage Survey (ACS).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 11 2013-01-01 2013-01-01 false Area Coverage Survey (ACS). 1737.31 Section 1737.31... Studies-Area Coverage Survey and Loan Design § 1737.31 Area Coverage Survey (ACS). (a) The Area Coverage Survey (ACS) is a market forecast of service requirements of subscribers in a proposed service area....

  20. Quantitative historical hydrology in Europe

    NASA Astrophysics Data System (ADS)

    Benito, G.; Brázdil, R.; Herget, J.; Machado, M. J.

    2015-04-01

    In the last decades, the quantification of flood hydrological characteristics (peak discharge, hydrograph shape, and runoff volume) from documentary evidence has gained scientific recognition as a method to lengthen flood records of rare and extreme events. This paper describes the methodological evolution of quantitative historical hydrology under the influence of developments in hydraulics and statistics. In the 19th century, discharge calculations based on flood marks were the only source of hydrological data for engineering design, but were later left aside in favour of systematic gauge records and conventional hydrological procedures. In the last two decades, there has been growing scientific and public interest in understanding long-term patterns of rare floods, in maintaining the flood heritage and memory of extremes, and in developing methods for deterministic and statistical application to different scientific and engineering problems. A compilation of 45 case studies across Europe with reconstructed discharges demonstrates that (1) in most cases present flood magnitudes are not unusual within the context of the last millennium, although recent floods may exceed past floods in some temperate European rivers (e.g. the Vltava and Po rivers); (2) the frequency of extreme floods has decreased since the 1950s, although some rivers (e.g. the Gardon and Ouse rivers) show a reactivation of rare events over the last two decades. There is great potential for gaining understanding of individual extreme events based on a combined multiproxy approach (palaeoflood and documentary records) providing high-resolution flood time series and their environmental and climatic changes; and for developing non-systematic and non-stationary statistical models based on relations of past floods with external and internal covariates under natural low-frequency climate variability.

  1. Quantitative historical hydrology in Europe

    NASA Astrophysics Data System (ADS)

    Benito, G.; Brázdil, R.; Herget, J.; Machado, M. J.

    2015-08-01

    In recent decades, the quantification of flood hydrological characteristics (peak discharge, hydrograph shape, and runoff volume) from documentary evidence has gained scientific recognition as a method to lengthen flood records of rare and extreme events. This paper describes the methodological evolution of quantitative historical hydrology under the influence of developments in hydraulics and statistics. In the 19th century, discharge calculations based on flood marks were the only source of hydrological data for engineering design, but were later left aside in favour of systematic gauge records and conventional hydrological procedures. In the last two decades, there has been growing scientific and public interest in understanding long-term patterns of rare floods, in maintaining the flood heritage and memory of extremes, and developing methods for deterministic and statistical application to different scientific and engineering problems. A compilation of 46 case studies across Europe with reconstructed discharges demonstrates that (1) in most cases present flood magnitudes are not unusual within the context of the last millennium, although recent floods may exceed past floods in some temperate European rivers (e.g. the Vltava and Po rivers); (2) the frequency of extreme floods has decreased since the 1950s, although some rivers (e.g. the Gardon and Ouse rivers) show a reactivation of rare events over the last two decades. There is a great potential for gaining understanding of individual extreme events based on a combined multiproxy approach (palaeoflood and documentary records) providing high-resolution time flood series and their environmental and climatic changes; and for developing non-systematic and non-stationary statistical models based on relations of past floods with external and internal covariates under natural low-frequency climate variability.
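    A minimal sketch of the kind of discharge calculation from flood marks that both versions of this record describe, assuming the slope-area method with Manning's equation (a common choice in palaeoflood hydrology; every input here is an analyst-supplied assumption about the channel at the time of the flood):

```python
def manning_discharge(area_m2, wetted_perimeter_m, slope, n_roughness):
    """Peak discharge (m^3/s) for a cross-section reconstructed up to a
    historical flood mark, via Manning's equation:
        Q = (1/n) * A * R^(2/3) * S^(1/2)
    where R = A/P is the hydraulic radius, S the energy slope, and n the
    roughness coefficient."""
    hydraulic_radius = area_m2 / wetted_perimeter_m
    return (1.0 / n_roughness) * area_m2 * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5
```

    For a hypothetical 50 m wide rectangular channel with a flood mark 4 m above the bed (A = 200 m², P = 58 m), S = 0.001 and n = 0.035 give a reconstructed peak discharge on the order of 400 m³/s; the uncertainty in n and S is what makes such reconstructions sensitive to analyst judgment.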

  2. Environmental distribution of two widespread uncultured freshwater Euryarchaeota clades unveiled by specific primers and quantitative PCR.

    PubMed

    Restrepo-Ortiz, Claudia X; Casamayor, Emilio O

    2013-12-01

    Quantitative environmental distribution of two widely distributed uncultured freshwater Euryarchaeota with unknown functional role was explored using newly designed quantitative PCR primers targeting the 16S rRNA gene of the clades Miscellaneous Euryarchaeota Group (MEG, containing the groups pMC2A384 and VALII/Eury4) and Deep-Sea Euryarchaeotal Groups (DSEG, targeting the cluster named VALIII containing the DHVE-3/DSEG, BC07-2A-27/DSEG-3 and DSEG-2 groups), respectively. The summer surface plankton of 28 lakes was analysed, and one additional dimictic deep alpine lake, Lake Redon, was temporally and vertically surveyed covering seasonal limnological variability. A trophic range between 0.2 and 5.2 µg l(-1) Chl a and a pH span from 3.8 to 9.5 were explored at altitudes between 632 and 2590 m above sea level. The primers proved highly selective, with c. 85% coverage and 100% specificity. Only pH significantly explained the observed changes in gene abundances. In Lake Redon, DSEG bloomed in deep stratified waters both in summer and early spring, and MEG at intermediate depths during the ice-cover period. Overall, MEG and DSEG showed a differential ecological distribution, although correlational analyses indicated a lack of coupling of both Euryarchaeota with phytoplankton (chlorophyll a). However, an intriguing positive and significant relationship was found between DSEG and putative ammonia-oxidizing thaumarchaeota. PMID:24249295

  3. Quantitative Genetics, House Sparrows and a Multivariate

    E-print Network

    Steinsland, Ingelin

    NTNU. Quantitative Genetics, House Sparrows and a Multivariate Gaussian Markov Random Field Model. Quantitative genetics is the study of quantitative characters. It is based

  4. Quantitative Portraits of Lexical Elements Kyo Kageura

    E-print Network

    Quantitative Portraits of Lexical Elements. Kyo Kageura, Human and Social Information Research. Quantitative "weightings" of lexical elements are defined, and quantitative portraits of a few lexical items are then drawn using these quantitative measures.

  5. Quantitatively Probing the Al Distribution in Zeolites

    SciTech Connect

    Vjunov, Aleksei; Fulton, John L.; Huthwelker, Thomas; Pin, Sonia; Mei, Donghai; Schenter, Gregory K.; Govind, Niranjan; Camaioni, Donald M.; Hu, Jian Z.; Lercher, Johannes A.

    2014-06-11

    The degree of substitution of Si4+ by Al3+ in the oxygen-terminated tetrahedra (Al T-sites) of zeolites determines the concentration of ion-exchange and Brønsted acid sites. As the location of the tetrahedra and the associated subtle variations in bond angles influence the acid strength, quantitative information about Al T-sites in the framework is critical to rationalize catalytic properties and to design new catalysts. A quantitative analysis is reported that uses a combination of extended X-ray absorption fine structure (EXAFS) analysis and 27Al MAS NMR spectroscopy supported by DFT-based molecular dynamics simulations. To discriminate individual Al atoms, sets of ab initio EXAFS spectra for various T-sites are generated from DFT-based molecular dynamics simulations allowing quantitative treatment of the EXAFS single- and multiple-photoelectron scattering processes out to 3-4 atom shells surrounding the Al absorption center. It is observed that identical zeolite types show dramatically different Al-distributions. A preference of Al for T-sites that are part of one or more 4-member rings in the framework over those T-sites that are part of only 5- and 6-member rings in the HBEA150 sample has been determined from a combination of these methods. This work was supported by the U. S. Department of Energy (DOE), Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences.

  6. Quantitative analysis of colony morphology in yeast

    PubMed Central

    Ruusuvuori, Pekka; Lin, Jake; Scott, Adrian C.; Tan, Zhihao; Sorsa, Saija; Kallio, Aleksi; Nykter, Matti; Yli-Harja, Olli; Shmulevich, Ilya; Dudley, Aimée M.

    2014-01-01

    Microorganisms often form multicellular structures such as biofilms and structured colonies that can influence the organism’s virulence, drug resistance, and adherence to medical devices. Phenotypic classification of these structures has traditionally relied on qualitative scoring systems that limit detailed phenotypic comparisons between strains. Automated imaging and quantitative analysis have the potential to improve the speed and accuracy of experiments designed to study the genetic and molecular networks underlying different morphological traits. For this reason, we have developed a platform that uses automated image analysis and pattern recognition to quantify phenotypic signatures of yeast colonies. Our strategy enables quantitative analysis of individual colonies, measured at a single time point or over a series of time-lapse images, as well as the classification of distinct colony shapes based on image-derived features. Phenotypic changes in colony morphology can be expressed as changes in feature space trajectories over time, thereby enabling the visualization and quantitative analysis of morphological development. To facilitate data exploration, results are plotted dynamically through an interactive Yeast Image Analysis web application (YIMAA; http://yimaa.cs.tut.fi) that integrates the raw and processed images across all time points, allowing exploration of the image-based features and principal components associated with morphological development. PMID:24447135

  7. UKIRT Widefield Infrared Survey for Fe$^+$

    E-print Network

    Lee, Jae-Joon; Lee, Yong-Hyun; Lee, Ho-Gyu; Shinn, Jong-Ho; Kim, Hyun-Jeong; Kim, Yesol; Pyo, Tae-Soo; Moon, Dae-Sik; Yoon, Sung-Chul; Chun, Moo-Young; Froebrich, Dirk; Davis, Chris J; Varricatt, Watson P; Kyeong, Jaemann; Hwang, Narae; Park, Byeong-Gon; Lee, Myung Gyoon; Lee, Hyung Mok; Ishiguro, Masateru

    2014-01-01

    The United Kingdom Infrared Telescope (UKIRT) Widefield Infrared Survey for Fe$^+$ (UWIFE) is a 180 deg$^2$ imaging survey of the first Galactic quadrant (7$^{\\circ}$ < l < 62$^{\\circ}$) using a narrow-band filter centred on the [Fe II] 1.644 {\\mu}m line. The [Fe II] 1.644 {\\mu}m emission is a good tracer of dense, shock-excited gas, and the survey will probe violent environments around stars: star-forming regions, evolved stars, and supernova remnants, among others. The UWIFE survey is designed to complement the existing UKIRT Widefield Infrared Survey for H2 (UWISH2; Froebrich et al. 2011). The survey will also complement existing broad-band surveys. The observed images have a nominal 5{\\sigma} detection limit of 18.7 mag for point sources, with a median seeing of 0.83". For extended sources, we estimate a surface brightness limit of 8.1 x 10$^{-20}$ W m$^{-2}$ arcsec$^{-2}$. In this paper, we present the overview and preliminary results of this survey.

  8. Multiple Surveys of Students and Survey Fatigue

    ERIC Educational Resources Information Center

    Porter, Stephen R.; Whitcomb, Michael E.; Weitzer, William H.

    2004-01-01

    This chapter reviews the literature on survey fatigue and summarizes a research project that indicates that administering multiple surveys in one academic year can significantly suppress response rates in later surveys. (Contains 4 tables.)

  9. EASTERN LAKE SURVEY-PHASE II AND NATIONAL STREAM SURVEY-PHASE I PROCESSING LABORATORY OPERATIONS REPORT

    EPA Science Inventory

    The National Surface Water Survey was designed to characterize surface water chemistry in regions of the United States believed to be potentially sensitive to acidic deposition. The National Stream Survey was a synoptic survey designed to quantify the chemistry of streams in area...

  10. Utilizing qualitative methods in survey design: examining Texas cattle producers' intent to participate in foot-and-mouth disease detection and control.

    PubMed

    Delgado, Amy H; Norby, Bo; Dean, Wesley R; McIntosh, W Alex; Scott, H Morgan

    2012-02-01

    The effective control of an outbreak of a highly contagious disease such as foot-and-mouth disease (FMD) in the United States will require a strong partnership between the animal agriculture industry and the government. However, because of the diverse number of economic, social, and psychological influences affecting livestock producers, their complete cooperation during an outbreak may not be assured. We conducted interviews with 40 individuals involved in the Texas cattle industry in order to identify specific behaviors where producer participation or compliance may be reduced. Through qualitative analysis of these interviews, we identified specific factors which the participants suggested would influence producer behavior in regard to FMD detection and control. Using the Theory of Planned Behavior (TPB) as an initial guide, we developed an expanded theoretical framework in order to allow for the development of a questionnaire and further evaluation of the relative importance of the relationships indicated in the framework. A 2-day stakeholder workshop was used to develop and critique the final survey instruments. The behaviors which we identified where producer compliance may be reduced included requesting veterinary examination of cattle with clinical signs of FMD either before or during an outbreak of FMD, gathering and holding cattle at the date and time requested by veterinary authorities, and maintaining cattle in their current location during an outbreak of FMD. In addition, we identified additional factors which may influence producers' behavior including risk perception, trust in other producers and regulatory agencies, and moral norms. The theoretical frameworks presented in this paper can be used during an outbreak to assess barriers to and social pressures for producer compliance, prioritize the results in terms of their effects on behavior, and improve and better target risk communication strategies. PMID:21968089

  11. Quantitative photoacoustic imaging in radiative transport regime

    E-print Network

    Ren, Kui

    Quantitative photoacoustic imaging in radiative transport regime. Alexander V. Mamonov, Kui Ren. July 6, 2012. Abstract: The objective of quantitative photoacoustic tomography (QPAT) is to reconstruct. Key words: quantitative photoacoustic tomography (QPAT), sectional photoacoustic tomography

  12. Focused Group Interviews as an Innovative Quanti-Qualitative Methodology (QQM): Integrating Quantitative Elements into a Qualitative Methodology

    ERIC Educational Resources Information Center

    Grim, Brian J.; Harmon, Alison H.; Gromis, Judy C.

    2006-01-01

    There is a sharp divide between quantitative and qualitative methodologies in the social sciences. We investigate an innovative way to bridge this gap that incorporates quantitative techniques into a qualitative method, the "quanti-qualitative method" (QQM). Specifically, our research utilized small survey questionnaires and experiment-like…

  13. Label-Free Quantitative Proteomics in Yeast.

    PubMed

    Léger, Thibaut; Garcia, Camille; Videlier, Mathieu; Camadro, Jean-Michel

    2016-01-01

    Label-free bottom-up shotgun MS-based proteomics is an extremely powerful and simple tool to provide high quality quantitative analyses of the yeast proteome with only microgram amounts of total protein. Although the experimental design of this approach is rather straightforward and does not require the modification of growth conditions, proteins or peptides, several factors must be taken into account to benefit fully from the power of this method. Key factors include the choice of an appropriate method for the preparation of protein extracts, careful evaluation of the instrument design and available analytical capabilities, the choice of the quantification method (intensity-based vs. spectral count), and the proper manipulation of the selected quantification algorithm. The elaboration of this robust workflow for data acquisition, processing, and analysis provides unprecedented insight into the dynamics of the yeast proteome. PMID:26483028
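    As an illustration of the spectral-count side of the quantification choice discussed above, here is a minimal sketch of the Normalized Spectral Abundance Factor (NSAF), one widely used spectral-count metric (illustrative only; the chapter itself weighs spectral-count against intensity-based methods and does not prescribe this particular algorithm):

```python
def nsaf(spectral_counts, protein_lengths):
    """Normalized Spectral Abundance Factor: spectral counts are divided
    by protein length (longer proteins yield more peptides), then
    rescaled so the values sum to 1, making runs of different sequencing
    depth comparable."""
    saf = {p: spectral_counts[p] / protein_lengths[p] for p in spectral_counts}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}
```

    With equal raw counts, the shorter protein receives the larger NSAF, which is the length bias this normalization is meant to remove.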

  14. Survey overview Instrument Construction

    E-print Network

    Sheridan, Jennifer

    Outline: survey overview; instrument construction; survey logistics; response rates; uses of survey, to develop new initiatives for faculty on campus. University of Wisconsin Survey Center, 630 W. Mifflin, Room 174, Madison, WI 53703-2636. Survey Overview

  15. Infrastructure Survey 2011

    ERIC Educational Resources Information Center

    Group of Eight (NJ1), 2012

    2012-01-01

    In 2011, the Group of Eight (Go8) conducted a survey on the state of its buildings and infrastructure. The survey is the third Go8 Infrastructure survey, with previous surveys being conducted in 2007 and 2009. The current survey updated some of the information collected in the previous surveys. It also collated data related to aspects of the…

  16. Geophex Airborne Unmanned Survey System

    SciTech Connect

    Won, I.L.; Keiswetter, D.

    1995-12-31

    Ground-based surveys place personnel at risk due to the proximity of buried unexploded ordnance (UXO) items or by exposure to radioactive materials and hazardous chemicals. The purpose of this effort is to design, construct, and evaluate a portable, remotely-piloted, airborne, geophysical survey system. This non-intrusive system will provide stand-off capability to conduct surveys and detect buried objects, structures, and conditions of interest at hazardous locations. During a survey, the operators remain remote from, but within visual distance of, the site. The sensor system never contacts the Earth, but can be positioned near the ground so that weak geophysical anomalies can be detected. The Geophex Airborne Unmanned Survey System (GAUSS) is designed to detect and locate small-scale anomalies at hazardous sites using magnetic and electromagnetic survey techniques. The system consists of a remotely-piloted, radio-controlled, model helicopter (RCH) with flight computer, light-weight geophysical sensors, an electronic positioning system, a data telemetry system, and a computer base-station. The report describes GAUSS and its test results.

  17. A survey of Sub-Saharan African medical schools

    PubMed Central

    2012-01-01

    Background Sub-Saharan Africa suffers a disproportionate share of the world's burden of disease while having some of the world's greatest health care workforce shortages. Doctors are an important component of any high functioning health care system. However, efforts to strengthen the doctor workforce in the region have been limited by a small number of medical schools with limited enrolments, international migration of graduates, poor geographic distribution of doctors, and insufficient data on medical schools. The goal of the Sub-Saharan African Medical Schools Study (SAMSS) is to increase the level of understanding and expand the baseline data on medical schools in the region. Methods The SAMSS survey is a descriptive survey study of Sub-Saharan African medical schools. The survey instrument included quantitative and qualitative questions focused on institutional characteristics, student profiles, curricula, post-graduate medical education, teaching staff, resources, barriers to capacity expansion, educational innovations, and external relationships with government and non-governmental organizations. Surveys were sent via e-mail to medical school deans or officials designated by the dean. Analysis is both descriptive and multivariable. Results Surveys were distributed to 146 medical schools in 40 of 48 Sub-Saharan African countries. One hundred and five responses were received (72% response rate). An additional 23 schools were identified after the close of the survey period. Fifty-eight respondents have been founded since 1990, including 22 private schools. Enrolments for medical schools range from 2 to 1800 and graduates range from 4 to 384. Seventy-three percent of respondents (n = 64) increased first year enrolments in the past five years. On average, 26% of respondents' graduates were reported to migrate out of the country within five years of graduation (n = 68). 
The most significant reported barriers to increasing the number of graduates, and improving quality, related to infrastructure and faculty limitations, respectively. Significant correlations were seen between schools implementing increased faculty salaries and bonuses, and lower percentage loss of faculty over the previous five years (P = 0.018); strengthened institutional research tools (P = 0.00015) and funded faculty research time (P = 0.045) and greater faculty involvement in research; and country compulsory service requirements (P = 0.039), a moderate number (1-5) of post-graduate medical education programs (P = 0.016) and francophone schools (P = 0.016) and greater rural general practice after graduation. Conclusions The results of the SAMSS survey increase the level of data and understanding of medical schools in Sub-Saharan Africa. These data serve as a baseline for future research, policies and investment in the health care workforce in the region, which will be necessary for improving health. PMID:22364206

  18. Bachelor of Engineering in Surveying & Geospatial Engineering / Bachelor of Arts Program 3704

    E-print Network

    New South Wales, University of

    Bachelor of Engineering in Surveying & Geospatial Engineering / Bachelor of Arts, Program 3704 (SAGE Project). Courses listed include: GMAT2120 Surveying & Geospatial Technology; CVEN2401 Sustainable Transport & Highway Engineering; 2500 Surveying Computations A; CVEN3501 Water Resources Engineering; CVEN4002 Design Practice A; Arts

  19. MARSAME Evaluate the Survey Results 6 EVALUATE THE SURVEY RESULTS

    E-print Network

    Tests can always be used to support the survey design by helping to ensure that the quantity and quality of the data meet the data quality objectives (DQOs) and measurement quality objectives (MQOs). Figure 6.1 illustrates the assessment phase of the data life cycle. 6.2 Conduct Data Quality Assessment. Data quality

  20. Laser Surveying

    NASA Technical Reports Server (NTRS)

    1978-01-01

    NASA technology has produced a laser-aided system for surveying land boundaries in difficult terrain. It does the job more accurately than conventional methods, takes only one-third the time normally required, and is considerably less expensive. In surveying to mark property boundaries, the objective is to establish an accurate heading between two "corner" points. This is conventionally accomplished by erecting a "range pole" at one point and sighting it from the other point through an instrument called a theodolite. But how do you take a heading between two points which are not visible to each other, for instance, when tall trees, hills or other obstacles obstruct the line of sight? That was the problem confronting the U.S. Department of Agriculture's Forest Service. The Forest Service manages 187 million acres of land in 44 states and Puerto Rico. Unfortunately, National Forest System lands are not contiguous but intermingled in complex patterns with privately-owned land. In recent years much of the private land has been undergoing development for purposes ranging from timber harvesting to vacation resorts. There is a need for precise boundary definition so that both private owners and the Forest Service can manage their properties with confidence that they are not trespassing on the other's land.
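    Once both corner points have coordinates, the heading the passage describes reduces to a bearing computation between them; a minimal spherical-Earth sketch follows (geodetic-grade boundary surveying would use an ellipsoidal model and state plane coordinates, so this is illustrative only):

```python
import math

def initial_bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing (degrees clockwise from true north)
    from point 1 to point 2, on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0
```

    A point due east on the equator yields a bearing of 90 degrees; the value is what a theodolite sighting between intervisible points would measure directly, and what a laser-aided system must derive when the line of sight is obstructed.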