Science.gov

Sample records for quantitative survey design

  1. Telephone Survey Designs.

    ERIC Educational Resources Information Center

    Casady, Robert J.

    The concepts, definitions, and notation that have evolved with the development of telephone survey design methodology are discussed and presented as a unified structure. This structure is then applied to some of the more well-known telephone survey designs and alternative designs are developed. The relative merits of the different survey designs…

  2. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U.S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  3. Survey design for detecting rare freshwater mussels

    USGS Publications Warehouse

    Smith, D.R.

    2006-01-01

    A common objective when surveying freshwater mussels is to detect the presence of rare populations. In certain situations, such as when endangered or threatened species are potentially in the area of a proposed impact, the survey should be designed to ensure a high probability of detecting species presence. Linking survey design to probability of detecting species presence has been done for quantitative surveys, but commonly applied designs that are based on timed searches have not made that connection. I propose a semiquantitative survey design that links search area and search efficiency to probability of detecting species presence. The survey can be designed to protect against failing to detect populations above a threshold abundance (or density). I illustrate the design for surveys to detect clubshell (Pleurobema clava) and northern riffleshell (Epioblasma torulosa rangiana) in the Allegheny River. Monte Carlo simulation indicated that the proposed survey design performs well under a range of spatial distributions and low densities (<0.05/m²) where search area is sufficient to ensure that the probability of detecting species presence is predicted to be ≥0.85. © 2006 by The North American Benthological Society.
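The link the abstract draws between search area, search efficiency, and detection probability can be sketched under a simplifying assumption. If individuals are Poisson-distributed (an illustrative assumption, not necessarily the paper's exact model), the probability of detecting at least one individual is P = 1 - exp(-e·d·A), which can be inverted to find the search area needed to hit a target detection probability:

```python
import math

def detection_probability(density, area, efficiency):
    """P(detect >= 1 individual), assuming a Poisson-distributed
    population with mean density (per m^2), searched area (m^2),
    and per-individual detection efficiency in [0, 1]."""
    return 1.0 - math.exp(-efficiency * density * area)

def required_area(density, efficiency, target=0.85):
    """Smallest search area giving at least the target detection
    probability at the threshold density (inverts the formula above)."""
    return -math.log(1.0 - target) / (efficiency * density)

# Example: threshold density 0.05 / m^2, 50% search efficiency.
area = required_area(density=0.05, efficiency=0.5, target=0.85)
print(round(area, 1))                                     # → 75.9 m^2
print(round(detection_probability(0.05, area, 0.5), 2))   # → 0.85
```

All parameter values are illustrative; only the 0.05/m² threshold and the 0.85 target echo the abstract.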

  4. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  5. 1998 construction design survey.

    PubMed

    Pinto, C

    1998-03-23

    Healthcare construction and design firms are discovering new worlds of opportunity--some domestic, some international and some unconventional. Emergency room renovation and construction work is drawing more attention, as are overseas projects and facilities designed with alternative medicine in mind. The number of projects rose 11% in 1997, but the overall cost of completed projects stayed flat compared with the previous year. PMID:10183405

  6. 1997 construction & design survey.

    PubMed

    Pinto, C

    1997-03-31

    Managed care might seem to be putting a damper on healthcare construction, but in fact it's one of several industry changes creating opportunities for architectural and design firms. One example of a trend toward making surroundings as pleasant as possible is the west campus expansion at East Texas Medical Center in Tyler (left). Designed and built by Ellerbe Becket and completed in 1995, the project, including a nine-story medical office building, features artwork and rooftop gardens. PMID:10165801

  7. WATERSHED BASED SURVEY DESIGNS

    EPA Science Inventory

    The development of watershed-based design and assessment tools will help to serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional condition to meet Section 305(b), identification of impaired water bodies or wate...

  8. Report on Solar Water Heating Quantitative Survey

    SciTech Connect

    Focus Marketing Services

    1999-05-06

    This report details the results of a quantitative research study undertaken to better understand the marketplace for solar water-heating systems from the perspective of home builders, architects, and home buyers.

  9. How To Design Surveys. The Survey Kit, Volume 5.

    ERIC Educational Resources Information Center

    Fink, Arlene

    The nine-volume Survey Kit is designed to help readers prepare and conduct surveys and become better users of survey results. All the books in the series contain instructional objectives, exercises and answers, examples of surveys in use, illustrations of survey questions, guidelines for action, checklists of "dos and don'ts," and annotated…

  10. RESOLVE and ECO: Survey Design

    NASA Astrophysics Data System (ADS)

    Kannappan, Sheila; Moffett, Amanda J.; Norris, Mark A.; Eckert, Kathleen D.; Stark, David; Berlind, Andreas A.; Snyder, Elaine M.; Norman, Dara J.; Hoversten, Erik A.; RESOLVE Team

    2016-01-01

    The REsolved Spectroscopy Of a Local VolumE (RESOLVE) survey is a volume-limited census of stellar, gas, and dynamical mass as well as star formation and galaxy interactions within >50,000 cubic Mpc of the nearby cosmic web, reaching down to dwarf galaxies of baryonic mass ~10^9 Msun and spanning multiple large-scale filaments, walls, and voids. RESOLVE is surrounded by the ~10x larger Environmental COntext (ECO) catalog, with matched custom photometry and environment metrics enabling analysis of cosmic variance with greater statistical power. For the ~1500 galaxies in its two equatorial footprints, RESOLVE goes beyond ECO in providing (i) deep 21cm data with adaptive sensitivity ensuring HI mass detections or upper limits <10% of the stellar mass and (ii) 3D optical spectroscopy including both high-resolution ionized gas or stellar kinematic data for each galaxy and broad 320-725nm spectroscopy spanning [OII] 3727, Halpha, and Hbeta. RESOLVE is designed to complement other radio and optical surveys in providing diverse, contiguous, and uniform local/global environment data as well as unusually high completeness extending into the gas-dominated dwarf galaxy regime. RESOLVE also offers superb reprocessed photometry including full, deep NUV coverage and synergy with other equatorial surveys as well as unique northern and southern facilities such as Arecibo, the GBT, and ALMA. The RESOLVE and ECO surveys have been supported by funding from NSF grants AST-0955368 and OCI-1156614.

  11. Watershed-based survey designs.

    PubMed

    Detenbeck, Naomi E; Cincotta, Dan; Denver, Judith M; Greenlee, Susan K; Olsen, Anthony R; Pitchford, Ann M

    2005-04-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream-downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs. PMID:15861987
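The unequal probability weighting mentioned in the abstract is typically handled with a Horvitz-Thompson estimator: each sampled unit's value is weighted by the inverse of its inclusion probability. A minimal sketch, with invented watershed values and inclusion probabilities:

```python
def horvitz_thompson_total(sample):
    """Design-unbiased (Horvitz-Thompson) estimate of a population
    total: sum of y_i / pi_i over the sampled units, where pi_i is
    the unit's inclusion probability under the survey design."""
    return sum(y / pi for y, pi in sample)

# Hypothetical sample of 3 watersheds drawn with unequal
# probabilities (e.g., proportional to watershed area):
# (value measured on the unit, inclusion probability)
sample = [(12.0, 0.125), (30.0, 0.25), (45.0, 0.5)]
print(horvitz_thompson_total(sample))  # 96 + 120 + 90 = 306.0
```

Units with low inclusion probability carry large weights, which is how a design oversampling large or accessible watersheds still yields an unbiased estimate for the whole frame.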

  12. Watershed-based survey designs

    USGS Publications Warehouse

    Detenbeck, N.E.; Cincotta, D.; Denver, J.M.; Greenlee, S.K.; Olsen, A.R.; Pitchford, A.M.

    2005-01-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream-downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs. © Springer Science + Business Media, Inc. 2005.

  13. Qualities of a Psychiatric Mentor: A Quantitative Singaporean Survey

    ERIC Educational Resources Information Center

    Tor, Phern-Chern; Goh, Lee-Gan; Ang, Yong-Guan; Lim, Leslie; Winslow, Rasaiah-Munidasa; Ng, Beng-Yeong; Wong, Sze-Tai; Ng, Tse-Pin; Kia, Ee-Heok

    2011-01-01

    Objective: Psychiatric mentors are an important part of the new, seamless training program in Singapore. There is a need to assess the qualities of a good psychiatric mentor vis-a-vis those of a good psychiatrist. Method: An anonymous survey was sent out to all psychiatry trainees and psychiatrists in Singapore to assess quantitatively the…

  15. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  16. A quantitative model for designing keyboard layout.

    PubMed

    Shieh, K K; Lin, C C

    1999-02-01

    This study analyzed the quantitative relationship between keytapping times and ergonomic principles in typewriting skills. Keytapping times and key-operating characteristics of a female subject typing on the Qwerty and Dvorak keyboards for six weeks each were collected and analyzed. The results showed that characteristics of the typed material and the movements of hands and fingers were significantly related to keytapping times. The most significant factors affecting keytapping times were association frequency between letters, consecutive use of the same hand or finger, and the finger used. A regression equation for relating keytapping times to ergonomic principles was fitted to the data. Finally, a protocol for design of computerized keyboard layout based on the regression equation was proposed. PMID:10214637
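The kind of regression described above, relating keytapping times to ergonomic factors, can be sketched with ordinary least squares. All predictors and timing values below are hypothetical, not the study's data:

```python
import numpy as np

# Hypothetical predictors for each key transition: letter-pair
# association frequency, same-finger repetition (0/1), and
# same-hand repetition (0/1).
X = np.array([
    [0.9, 1, 1],
    [0.7, 0, 1],
    [0.5, 0, 0],
    [0.3, 1, 1],
    [0.1, 0, 0],
], dtype=float)
t = np.array([210.0, 160.0, 150.0, 230.0, 170.0])  # keytap times, ms

# Fit t ~ b0 + b1*freq + b2*same_finger + b3*same_hand by ordinary
# least squares, mirroring the style of regression the study fitted.
A = np.column_stack([np.ones(len(t)), X])
coef, *_ = np.linalg.lstsq(A, t, rcond=None)
pred = A @ coef
print(np.round(coef, 1))  # intercept and one slope per factor
```

Such an equation can then score candidate layouts by summing predicted keytapping times over a representative text corpus, which is the design protocol the abstract describes.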

  17. Quantitative three-dimensional low-speed wake surveys

    NASA Technical Reports Server (NTRS)

    Brune, G. W.

    1992-01-01

    Theoretical and practical aspects of conducting three-dimensional wake measurements in large wind tunnels are reviewed with emphasis on applications in low-speed aerodynamics. Such quantitative wake surveys furnish separate values for the components of drag, such as profile drag and induced drag, but also measure lift without the use of a balance. In addition to global data, details of the wake flowfield as well as spanwise distributions of lift and drag are obtained. The paper demonstrates the value of this measurement technique using data from wake measurements conducted by Boeing on a variety of low-speed configurations including the complex high-lift system of a transport aircraft.

  18. WATERSHED-BASED SURVEY DESIGNS

    EPA Science Inventory

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Sectio...

  19. Statistical considerations in designing raptor surveys

    USGS Publications Warehouse

    Pendleton, G.W.

    1989-01-01

    Careful sampling design is required to obtain useful estimates of raptor abundance. Well-defined objectives, selection of appropriate sample units and sampling scheme, and attention to detail to reduce extraneous sources of variability and error are all important considerations in designing a raptor survey.

  20. Strategies for joint geophysical survey design

    NASA Astrophysics Data System (ADS)

    Shakas, Alexis; Maurer, Hansruedi

    2015-04-01

    In recent years, the use of multiple geophysical techniques to image the subsurface has become a popular option. Joint inversions of geophysical datasets are based on the assumption that the spatial variations of the different physical subsurface parameters exhibit structural similarities. In this work, we combine the benefits of joint inversions of geophysical datasets with recent innovations in optimized experimental design. These techniques maximize the data information content while minimizing the data acquisition costs. Experimental design has been used in geophysics over the last twenty years, but it has not previously been applied to combinations of different geophysical imaging methods. We combine direct current geoelectrics, magnetotellurics and seismic refraction travel time tomography data to resolve synthetic 1D layered Earth models. An initial model for the subsurface structure can be taken from a priori geological information, and an optimal joint geophysical survey can be designed around the initial model. Another typical scenario includes an existing data set from a past survey and a subsequent survey that is planned to optimally complement the existing data. Our results demonstrate that the joint design methodology provides optimized combinations of data sets that include only a few data points. Nevertheless, they allow constraining the subsurface models equally well as data from a densely sampled survey. Furthermore, we examine the dependency of optimized survey design on the a priori model assumptions. Finally, we apply the methodology to geoelectric and seismic field data collected along 2D profiles.

  1. Survey of adaptive control using Liapunov design

    NASA Technical Reports Server (NTRS)

    Lindorff, D. P.; Carroll, R. L.

    1972-01-01

    A survey was made of the literature devoted to the synthesis of model-tracking adaptive systems based on application of Liapunov's second method. The basic synthesis procedure is introduced and a critical review of extensions made to the theory since 1966 is made. The extensions relate to design for relative stability, reduction of order techniques, design with disturbance, design with time variable parameters, multivariable systems, identification, and an adaptive observer.

  2. Survey Design for Large-Scale, Unstructured Resistivity Surveys

    NASA Astrophysics Data System (ADS)

    Labrecque, D. J.; Casale, D.

    2009-12-01

    In this paper, we discuss the issues in designing data collection strategies for large-scale, poorly structured resistivity surveys. Existing or proposed applications for these types of surveys include carbon sequestration, enhanced oil recovery monitoring, monitoring of leachate from working or abandoned mines, and mineral surveys. Electrode locations are generally chosen by land access, utilities, roads, existing wells, etc. Classical arrays such as the Wenner array or dipole-dipole arrays are not applicable if the electrodes cannot be placed in quasi-regular lines or grids. A new, far more generalized strategy is needed for building data collection schemes. Following the approach of earlier two-dimensional (2-D) survey designs, the proposed method begins by defining a base array. In 2-D design, this base array is often a standard dipole-dipole array. For unstructured three-dimensional (3-D) design, determining this base array is a multi-step process. The first step is to determine a set of base dipoles with similar characteristics. For example, the base dipoles may consist of electrode pairs trending within 30 degrees of north and between 100 and 250 m in length. These dipoles are then combined into a trial set of arrays. This trial set of arrays is reduced by applying a series of filters based on criteria such as separation between the dipoles. Using the base array set, additional arrays are added and tested to determine the overall improvement in resolution and to determine an optimal set of arrays. Examples of the design process are shown for a proposed carbon sequestration monitoring system.
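The dipole-filtering step described above can be sketched as follows. The electrode coordinates are hypothetical; the azimuth and length criteria follow the example values given in the abstract:

```python
import math
from itertools import combinations

def dipole_ok(p, q, max_azimuth_dev=30.0, min_len=100.0, max_len=250.0):
    """Keep an electrode pair as a base dipole if it trends within
    max_azimuth_dev degrees of north and its length lies in
    [min_len, max_len] meters. Coordinates are (easting, northing)."""
    dx, dy = q[0] - p[0], q[1] - p[1]
    length = math.hypot(dx, dy)
    azimuth = math.degrees(math.atan2(abs(dx), abs(dy)))  # 0 = due north
    return min_len <= length <= max_len and azimuth <= max_azimuth_dev

# Hypothetical electrode positions fixed by land access, in meters.
electrodes = [(0, 0), (20, 150), (300, 40), (310, 260), (15, 400)]
base = [(p, q) for p, q in combinations(electrodes, 2) if dipole_ok(p, q)]
print(len(base))  # → 2 pairs pass the azimuth and length filters
```

In the full method these base dipoles would then be combined into trial arrays and further filtered, e.g. by dipole separation, before resolution testing.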

  3. Ambulance Design Survey 2011: A Summary Report

    PubMed Central

    Lee, Y Tina; Kibira, Deogratias; Feeney, Allison Barnard; Marshall, Jennifer

    2013-01-01

    Current ambulance designs are ergonomically inefficient and oftentimes unsafe for practical treatment response to medical emergencies. Thus, the patient compartment of a moving ambulance is a hazardous working environment. As a consequence, emergency medical services (EMS) workers suffer fatalities and injuries that far exceed those of the average workplace in the United States. To reduce injury and mortality rates in ambulances, the Department of Homeland Security Science and Technology Directorate has teamed with the National Institute of Standards and Technology, the National Institute for Occupational Safety and Health, and BMT Designers & Planners in a joint project to produce science-based ambulance patient compartment design standards. This project will develop new crash-safety design standards and improved user-design interface guidance for patient compartments that are safer for EMS personnel and patients, and facilitate improved patient care. The project team has been working with practitioners, EMS workers’ organizations, and manufacturers to solicit needs and requirements to address related issues. This paper presents an analysis of practitioners’ concerns, needs, and requirements for improved designs elicited through the web-based survey of ambulance design conducted by the National Institute of Standards and Technology. This paper also introduces the survey, analyzes the survey results, and discusses recommendations for future ambulance patient compartment design. PMID:26401439

  4. Design-based and model-based inference in surveys of freshwater mollusks

    USGS Publications Warehouse

    Dorazio, R.M.

    1999-01-01

    Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.

  5. Spatially balanced survey designs for natural resources

    EPA Science Inventory

    Ecological resource monitoring programs typically require the use of a probability survey design to select locations or entities to be physically sampled in the field. The ecological resource of interest, the target population, occurs over a spatial domain and the sample selecte...

  6. Sample Design for Educational Survey Research.

    ERIC Educational Resources Information Center

    Ross, Kenneth N.

    1978-01-01

    Student's empirical sampling approach is used to assess the magnitude of the sampling errors of statistics describing a recursive causal model. The data were gathered with four complex sample designs commonly used in educational surveys. Jackknife and half-sample error estimates are applied to the data. (Author/CTM)
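The jackknife error estimation mentioned above can be illustrated with a minimal delete-one sketch. The data are invented; as a sanity check, for the sample mean the jackknife standard error reduces to the familiar s/√n:

```python
def jackknife_se(values, statistic):
    """Delete-one jackknife standard error of an arbitrary statistic."""
    n = len(values)
    # Recompute the statistic with each observation left out in turn.
    leave_one_out = [statistic(values[:i] + values[i + 1:]) for i in range(n)]
    mean_loo = sum(leave_one_out) / n
    var = (n - 1) / n * sum((t - mean_loo) ** 2 for t in leave_one_out)
    return var ** 0.5

mean = lambda xs: sum(xs) / len(xs)
data = [4.0, 7.0, 6.0, 5.0, 8.0]
print(round(jackknife_se(data, mean), 4))  # → 0.7071, equal to s/sqrt(n)
```

Its appeal in complex survey settings is that `statistic` can be something with no closed-form variance, such as a regression coefficient from a causal model, which is the use case the abstract describes.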

  7. 76 FR 27384 - Agency Information Collection Activity (Veteran Suicide Prevention Online Quantitative Surveys...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-11

    ... AFFAIRS Agency Information Collection Activity (Veteran Suicide Prevention Online Quantitative Surveys...). Type of Review: New collection. Abstract: VA's top priority is the prevention of Veterans suicide. It... better understand Veterans and their families' awareness of VA's suicide prevention and mental...

  8. Optimal design of focused experiments and surveys

    NASA Astrophysics Data System (ADS)

    Curtis, Andrew

    1999-10-01

    Experiments and surveys are often performed to obtain data that constrain some previously underconstrained model. Often, constraints are most desired in a particular subspace of model space. Experiment design optimization requires that the quality of any particular design can be both quantified and then maximized. This study shows how the quality can be defined such that it depends on the amount of information that is focused in the particular subspace of interest. In addition, algorithms are presented which allow one particular focused quality measure (from the class of focused measures) to be evaluated efficiently. A subclass of focused quality measures is also related to the standard variance and resolution measures from linearized inverse theory. The theory presented here requires that the relationship between model parameters and data can be linearized around a reference model without significant loss of information. Physical and financial constraints define the space of possible experiment designs. Cross-well tomographic examples are presented, plus a strategy for survey design to maximize information about linear combinations of parameters such as bulk modulus, κ = λ + 2μ/3.
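The focused-quality idea can be sketched numerically for a linearized problem d = Gm. The operator, damping, and greedy selection below are illustrative assumptions rather than the paper's exact algorithm: a candidate design is scored by the posterior variance it leaves in the subspace of interest, and measurements are added one at a time to maximize that score.

```python
import numpy as np

def focused_quality(G, focus, eps=1e-3):
    """Score a design (each row of G = one datum) by the posterior
    variance it leaves in a focus subspace: we return the negative
    trace of P C P^T, where C is the damped least-squares covariance
    and the rows of P span the focus directions (higher is better)."""
    C = np.linalg.inv(G.T @ G + eps * np.eye(G.shape[1]))
    P = np.atleast_2d(focus)
    return -np.trace(P @ C @ P.T)

rng = np.random.default_rng(0)
G_full = rng.standard_normal((20, 4))      # 20 candidate measurements
focus = np.array([[1.0, 0.0, 0.0, 0.0]])   # only parameter 0 matters

# Greedy design: repeatedly add the candidate measurement that most
# improves the focused quality measure.
chosen = []
for _ in range(3):
    best = max((i for i in range(20) if i not in chosen),
               key=lambda i: focused_quality(G_full[chosen + [i]], focus))
    chosen.append(best)
print(chosen)  # indices of the 3 selected measurements
```

Changing `focus` to a different linear combination of parameters (e.g. a bulk-modulus direction) refocuses the whole design without altering the machinery.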

  9. The Dark Energy Survey CCD imager design

    SciTech Connect

    Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Guarino, V.; Kuk, K.; Kuhlmann, S.; Schultz, K.; Schmitt, R.L.; Stefanik, A.; /Fermilab /Ohio State U. /Argonne

    2008-06-01

    The Dark Energy Survey is planning to use a 3 sq. deg. camera that houses a ~0.5 m diameter focal plane of 62 2k x 4k CCDs. The camera vessel including the optical window cell, focal plate, focal plate mounts, cooling system and thermal controls is described. As part of the development of the mechanical and cooling design, a full scale prototype camera vessel has been constructed and is now being used for multi-CCD readout tests. Results from this prototype camera are described.

  10. Design of future surveys: chapter 13

    USGS Publications Warehouse

    Bart, Jonathan; Smith, Paul A.

    2012-01-01

    This brief chapter addresses two related issues: how effort should be allocated to different parts of the sampling plan and, given optimal allocation, how large a sample will be required to achieve the PRISM accuracy target. Simulations based on data collected to date showed that 2 plots per cluster on rapid surveys, 2 intensive camps per field crew-year, 2-4 intensive plots per intensive camp, and 2-3 rapid surveys per intensive plot is the most efficient allocation of resources. Using this design, we investigated how crew-years should be allocated to each region in order to meet the PRISM accuracy target most efficiently. The analysis indicated that 40-50 crew-years would achieve the accuracy target for 18-24 of the 26 species breeding widely in the Arctic. This analysis was based on assuming that two rounds of surveys were conducted and that a 50% decline occurred between them. We discuss the complexity of making these estimates and why they should be viewed as first approximations.

  11. Quantitative study designs used in quality improvement and assessment.

    PubMed

    Ormes, W S; Brim, M B; Coggan, P

    2001-01-01

    This article describes common quantitative design techniques that can be used to collect and analyze quality data. An understanding of the differences between these design techniques can help healthcare quality professionals make the most efficient use of their time, energies, and resources. To evaluate the advantages and disadvantages of these various study designs, it is necessary to assess factors that threaten the degree to which quality professionals may infer a cause-and-effect relationship from the data collected. Processes, the conduits of organizational function, often can be assessed by methods that do not take into account confounding and compromising circumstances that affect the outcomes of their analyses. An assumption that the implementation of process improvements may cause real change is incomplete without a consideration of other factors that might also have caused the same result. It is only through the identification, assessment, and exclusion of these alternative factors that administrators and healthcare quality professionals can assess the degree to which true process improvement or compliance has occurred. This article describes the advantages and disadvantages of common quantitative design techniques and reviews the corresponding threats to the interpretability of data obtained from their use. PMID:11378972

  12. Online Survey Design and Development: A Janus-Faced Approach

    ERIC Educational Resources Information Center

    Lauer, Claire; McLeod, Michael; Blythe, Stuart

    2013-01-01

    In this article we propose a "Janus-faced" approach to survey design--an approach that encourages researchers to consider how they can design and implement surveys more effectively using the latest web and database tools. Specifically, this approach encourages researchers to look two ways at once; attending to both the survey interface

  14. Survey of rural, private wells. Statistical design

    USGS Publications Warehouse

    Mehnert, Edward; Schock, Susan C.

    1991-01-01

    Half of Illinois' 38 million acres were planted in corn and soybeans in 1988. On the 19 million acres planted in corn and soybeans, approximately 1 million tons of nitrogen fertilizer and 50 million pounds of pesticides were applied. Because groundwater is the water supply for over 90 percent of rural Illinois, the occurrence of agricultural chemicals in groundwater in Illinois is of interest to the agricultural community, the public, and regulatory agencies. The occurrence of agricultural chemicals in groundwater is well documented. However, the extent of this contamination still needs to be defined. This can be done by randomly sampling wells across a geographic area. Key elements of a random, water-well sampling program for regional groundwater quality include the overall statistical design of the program, definition of the sample population, selection of wells to be sampled, and analysis of survey results. These elements must be consistent with the purpose for conducting the program; otherwise, the program will not provide the desired information. The need to carefully design and conduct a sampling program becomes readily apparent when one considers the high cost of collecting and analyzing a sample. For a random sampling program conducted in Illinois, the key elements, as well as the limitations imposed by available information, are described.

  15. Adaptive time-lapse optimized survey design for electrical resistivity tomography monitoring

    NASA Astrophysics Data System (ADS)

    Wilkinson, Paul B.; Uhlemann, Sebastian; Meldrum, Philip I.; Chambers, Jonathan E.; Carrière, Simon; Oxby, Lucy S.; Loke, M. H.

    2015-10-01

    Adaptive optimal experimental design methods use previous data and results to guide the choice and design of future experiments. This paper describes the formulation of an adaptive survey design technique to produce optimal resistivity imaging surveys for time-lapse geoelectrical monitoring experiments. These survey designs are time-dependent and, compared to dipole-dipole or static optimized surveys that do not change over time, focus a greater degree of the image resolution on regions of the subsurface that are actively changing. The adaptive optimization method is validated using a controlled laboratory monitoring experiment comprising a well-defined cylindrical target moving along a trajectory that changes its depth and lateral position. The algorithm is implemented on a standard PC in conjunction with a modified automated multichannel resistivity imaging system. Data acquisition using the adaptive survey designs requires no more time or power than with comparable standard surveys, and the algorithm processing takes place while the system batteries recharge. The results show that adaptively designed optimal surveys yield a quantitative increase in image quality over and above that produced by using standard dipole-dipole or static (time-independent) optimized surveys.

  16. 76 FR 9637 - Proposed Information Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-18

    ... AFFAIRS Proposed Information Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity... outreach efforts on the prevention of suicide among Veterans and their families. DATES: Written comments...). Type of Review: New collection. Abstract: VA's top priority is the prevention of Veterans suicide....

  17. Quantitative survey on the shape of the back of men's head as viewed from the side.

    PubMed

    Tamir, Abraham

    2013-05-01

    This article quantitatively classifies the back of men's heads, viewed from the side, into 4 shapes, which are demonstrated in some of the figures in this article. For self-evident reasons, the shapes were blurred. The survey is based on the analysis of 2220 shapes, obtained mainly by photographing bald men and by finding pictures on the Internet. To the best of the author's knowledge, this quantitative approach has never been implemented before. The results are as follows: 376 "flat heads" (17%); 755 "little round heads" (34%); 1017 "round heads" (45.8%); and 72 "very round heads" (3.2%). This quantitative survey is a further step in the quantitative analysis of the parts of the face: in articles previously published or forthcoming in this journal, the shapes of the nose, ear conch, and human eye were analyzed quantitatively, as were the shapes of the toes. Finally, it should be noted that, for obvious reasons, the survey is based on men's heads, most of them bald. PMID:23714907
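The reported counts and percentages can be cross-checked with a few lines of arithmetic (the counts are taken from the abstract; note that the 17% quoted for flat heads is a rounding of 16.9%):

```python
# Category counts as reported in the abstract
counts = {"flat": 376, "little round": 755, "round": 1017, "very round": 72}

total = sum(counts.values())
print(total)  # 2220, matching the stated number of analyzed shapes

percentages = {k: round(100 * v / total, 1) for k, v in counts.items()}
print(percentages)
```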

  18. Design Effects and the Analysis of Survey Data.

    ERIC Educational Resources Information Center

    Folsom, Ralph E.; Williams, Rick L.

    The National Assessment of Educational Progress (NAEP), like most large national surveys, employs a complex stratified multistage unequal probability sample. The design provides a rigorous justification for extending survey results to the entire U.S. target population. Developments in the analysis of data from complex surveys which provide a…

  19. Design and Validation of the Quantum Mechanics Conceptual Survey

    ERIC Educational Resources Information Center

    McKagan, S. B.; Perkins, K. K.; Wieman, C. E.

    2010-01-01

    The Quantum Mechanics Conceptual Survey (QMCS) is a 12-question survey of students' conceptual understanding of quantum mechanics. It is intended to be used to measure the relative effectiveness of different instructional methods in modern physics courses. In this paper, we describe the design and validation of the survey, a process that included…

  1. A Method for Designing Instrument-Free Quantitative Immunoassays.

    PubMed

    Lathwal, Shefali; Sikes, Hadley D

    2016-03-15

    Colorimetric readouts are widely used in point-of-care diagnostic immunoassays to indicate either the presence or the absence of an analyte. For a variety of reasons, it is more difficult to quantify rather than simply detect an analyte using a colorimetric test. We report a method for designing, with minimal iteration, a quantitative immunoassay that can be interpreted objectively by a simple count of number of spots visible to the unaided eye. We combined a method called polymerization-based amplification (PBA) with a series of microscale features containing a decreasing surface density of capture molecules, and the central focus of the study is understanding how the choice of surface densities impacts performance. Using a model pair of antibodies, we have shown that our design approach does not depend on measurement of equilibrium and kinetic binding parameters and can provide a dynamic working range of 3 orders of magnitude (70 pM to 70 nM) for visual quantification. PMID:26878154

  2. Research on Basic Design Education: An International Survey

    ERIC Educational Resources Information Center

    Boucharenc, C. G.

    2006-01-01

    This paper reports on the results of a survey and qualitative analysis on the teaching of "Basic Design" in schools of design and architecture located in 22 countries. In the context of this research work, Basic Design means the teaching and learning of design fundamentals that may also be commonly referred to as the Principles of Two- and…

  3. Designing community surveys to provide a basis for noise policy

    NASA Technical Reports Server (NTRS)

    Fields, J. M.

    1980-01-01

    After examining reports from a large number of social surveys, two areas were identified where methodological improvements in the surveys would be especially useful for public policy. The two study areas are: the definition of noise indexes and the assessment of noise impact. Improvements in the designs of surveys are recommended which would increase the validity and reliability of the noise indexes. Changes in interview questions and sample designs are proposed which would enable surveys to provide measures of noise impact which are directly relevant for public policy.

  4. A quantitative approach to nonlinear IC process design rule scaling

    NASA Astrophysics Data System (ADS)

    Gold, Spencer Montgomery

    As minimum dimensions in integrated circuit technologies are reduced beyond 0.1 µm, linear process scaling becomes more difficult and costly. Exponentially rising manufacturing facility and process scaling costs can be better managed by performing nonlinear process shrinks. Nonlinear scaling allows the horizontal design rules to be reduced by different factors according to their ability to provide area and performance improvement in a cost-effective manner. This thesis describes a methodology and CAD tools for use in selecting nonlinear design rule reduction ratios that make effective tradeoffs between die cost and performance. The cost effectiveness of nonlinear scaling is demonstrated for a complementary GaAs (CGaAs™) process. CGaAs is a young technology with coarse design rules that would benefit significantly from a nonlinear shrink. The cost/benefit analysis for scaling the design rules is based on a process-independent optimizing SRAM compiler which was developed as part of this work. The methodology for nonlinear scaling includes identifying the rules which have the greatest impact on circuit area and analyzing the area and performance improvements as these rules are scaled through a range of practical scale factors. Benefit data (product of power and delay improvement ratios) is then combined with die cost estimates at each step to yield the cost/benefit ratio, a quantitative metric for design rule reduction. The slopes and inflection points of cost/benefit vs. scale factor plots guide process engineers in selecting reduction ratios for the various design rules. This procedure should be repeated, using the results of one pass as the starting point for the next. The cost/benefit analysis methodology compares embedded static RAMs that are generated by the PUMA process-independent SRAM compiler. This compiler, which is based on Duet's MasterPort™ layout compactor, can create optimized SRAM cell libraries for any complementary technology.
It is capable of exploring a large design space, including the ability to adjust the transistors within the six-transistor memory cell. It produces power-delay curves that are combined with SRAM area measurements to provide the power, delay, and area data required for a cost/benefit analysis. A 0.5 µm CGaAs process is analyzed to demonstrate the methodology. A cost/benefit analysis of the design rules shows that the first scaling step should include a reduction of at least four rules: minimum transistor width, source/drain ohmic width, ohmic contact width, and active overlap of contact. The proportion by which these rules should be reduced depends on the number of wafers over which the scaling costs are amortized, and ranges from 20 to 40%. A similar analysis of the effect of transistor threshold voltage reduction clearly showed diminishing cost/benefit and cost/delay returns for an embedded SRAM.
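The cost/benefit metric described in the abstract (die-cost ratio divided by the product of the power and delay improvement ratios) can be sketched as follows. All scale factors, cost ratios, and improvement ratios here are hypothetical placeholders, not data from the thesis:

```python
def cost_benefit(cost_ratio, power_improvement, delay_improvement):
    """Lower is better: scaled die cost relative to the combined
    power-delay benefit, in the spirit of the cost/benefit metric
    described in the abstract."""
    return cost_ratio / (power_improvement * delay_improvement)

# Hypothetical data: (scale factor, die-cost ratio, power gain, delay gain)
candidates = [
    (1.0, 1.00, 1.00, 1.00),
    (0.9, 0.85, 1.15, 1.10),
    (0.8, 0.74, 1.25, 1.15),
    (0.7, 0.78, 1.26, 1.12),  # diminishing returns: cost/benefit worsens
]
ratios = [(s, cost_benefit(c, pw, d)) for s, c, pw, d in candidates]
best_scale = min(ratios, key=lambda t: t[1])[0]
print(best_scale)  # 0.8 for these illustrative numbers
```

Plotting such ratios against the scale factor and looking for the slope changes and inflection points, as the abstract describes, is what guides the choice of reduction ratio on each pass.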

  5. Quantitative Trait Loci (QTL) Detection in Multicross Inbred Designs

    PubMed Central

    Crepieux, Sébastien; Lebreton, Claude; Servin, Bertrand; Charmet, Gilles

    2004-01-01

    Mapping quantitative trait loci in plants is usually conducted using a population derived from a cross between two inbred lines. The power of such QTL detection and the parameter estimates depend largely on the choice of the two parental lines. Thus, the QTL detected in such populations represent only a small part of the genetic architecture of the trait. In addition, the effects of only two alleles are characterized, which is of limited interest to the breeder, while common pedigree breeding material remains unexploited for QTL mapping. In this study, we extend QTL mapping methodology to a generalized framework, based on a two-step IBD variance component approach, applicable to any type of breeding population obtained from inbred parents. We then investigate with simulated data mimicking conventional breeding programs the influence of different estimates of the IBD values on the power of QTL detection. The proposed method would provide an alternative to the development of specifically designed recombinant populations, by utilizing the genetic variation actually managed by plant breeders. The use of these detected QTL in assisting breeding would thus be facilitated. PMID:15579720

  6. Designing occupancy studies: general advice and allocating survey effort

    USGS Publications Warehouse

    MacKenzie, D.I.; Royle, J. Andrew

    2005-01-01

    1. The fraction of sampling units in a landscape where a target species is present (occupancy) is an extensively used concept in ecology. Yet in many applications the species will not always be detected in a sampling unit even when present, resulting in biased estimates of occupancy. Given that sampling units are surveyed repeatedly within a relatively short timeframe, a number of similar methods have now been developed to provide unbiased occupancy estimates. However, practical guidance on the efficient design of occupancy studies has been lacking. 2. In this paper we comment on a number of general issues related to designing occupancy studies, including the need for clear objectives that are explicitly linked to science or management, selection of sampling units, timing of repeat surveys and allocation of survey effort. Advice on the number of repeat surveys per sampling unit is considered in terms of the variance of the occupancy estimator, for three possible study designs. 3. We recommend that sampling units should be surveyed a minimum of three times when detection probability is high (> 0.5 per survey), unless a removal design is used. 4. We found that an optimal removal design will generally be the most efficient, but we suggest it may be less robust to assumption violations than a standard design. 5. Our results suggest that for a rare species it is more efficient to survey more sampling units less intensively, while for a common species fewer sampling units should be surveyed more intensively. 6. Synthesis and applications. Reliable inferences can only result from quality data. To make the best use of logistical resources, study objectives must be clearly defined; sampling units must be selected, and repeated surveys timed appropriately; and a sufficient number of repeated surveys must be conducted. Failure to do so may compromise the integrity of the study. 
The guidance given here on study design issues is particularly applicable to studies of species occurrence and distribution, habitat selection and modelling, metapopulation studies and monitoring programmes.
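The repeat-survey advice above can be related to the standard cumulative-detection identity p* = 1 - (1 - p)^K. This is a generic formula, not the authors' variance-based analysis; with per-survey detectability p = 0.5, three surveys already give an 87.5% chance of at least one detection at an occupied unit:

```python
from math import ceil, log

def cumulative_detection(p: float, k: int) -> float:
    """Probability that a species present at a sampling unit is detected
    at least once in k independent surveys, each with probability p."""
    return 1.0 - (1.0 - p) ** k

def surveys_needed(p: float, target: float) -> int:
    """Smallest k whose cumulative detection probability reaches target."""
    return ceil(log(1.0 - target) / log(1.0 - p))

print(cumulative_detection(0.5, 3))  # 0.875
print(surveys_needed(0.5, 0.95))     # 5
print(surveys_needed(0.3, 0.95))     # 9
```

The steep growth of `surveys_needed` as p falls mirrors the paper's point that rare or hard-to-detect species shift the optimal allocation of effort.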

  7. Survey of adaptive control using Liapunov design

    NASA Technical Reports Server (NTRS)

    Lindorff, D. P.; Carroll, R. L.

    1973-01-01

    A survey of the literature in which Liapunov's second method is used in determining the control law is presented, with emphasis placed on the model-tracking adaptive control problem. Forty references are listed. Following a brief tutorial exposition of the adaptive control problem, the techniques for treating reduction of order, disturbance and time-varying parameters, multivariable systems, identification, and adaptive observers are discussed. The method is critically evaluated, particularly with respect to possibilities for application.

  8. Designing surveys for tests of gravity.

    PubMed

    Jain, Bhuvnesh

    2011-12-28

    Modified gravity theories may provide an alternative to dark energy to explain cosmic acceleration. We argue that the observational programme developed to test dark energy needs to be augmented to capture new tests of gravity on astrophysical scales. Several distinct signatures of gravity theories exist outside the 'linear' regime, especially owing to the screening mechanism that operates inside halos such as the Milky Way to ensure that gravity tests in the solar system are satisfied. This opens up several decades in length scale and classes of galaxies at low redshift that can be exploited by surveys. While theoretical work on models of gravity is in the early stages, we can already identify new regimes that cosmological surveys could target to test gravity. These include: (i) a small-scale component that focuses on the interior and vicinity of galaxy and cluster halos, (ii) spectroscopy of low-redshift galaxies, especially galaxies smaller than the Milky Way, in environments that range from voids to clusters, and (iii) a programme of combining lensing and dynamical information, from imaging and spectroscopic surveys, respectively, on the same (or statistically identical) sample of galaxies. PMID:22084295

  9. Variance estimation for systematic designs in spatial surveys.

    PubMed

    Fewster, R M

    2011-12-01

    In spatial surveys for estimating the density of objects in a survey region, systematic designs will generally yield lower variance than random designs. However, estimating the systematic variance is well known to be a difficult problem. Existing methods tend to overestimate the variance, so although the variance is genuinely reduced, it is over-reported, and the gain from the more efficient design is lost. The current approaches to estimating a systematic variance for spatial surveys are to approximate the systematic design by a random design, or approximate it by a stratified design. Previous work has shown that approximation by a random design can perform very poorly, while approximation by a stratified design is an improvement but can still be severely biased in some situations. We develop a new estimator based on modeling the encounter process over space. The new "striplet" estimator has negligible bias and excellent precision in a wide range of simulation scenarios, including strip-sampling, distance-sampling, and quadrat-sampling surveys, and including populations that are highly trended or have strong aggregation of objects. We apply the new estimator to survey data for the spotted hyena (Crocuta crocuta) in the Serengeti National Park, Tanzania, and find that the reported coefficient of variation for estimated density is 20% using approximation by a random design, 17% using approximation by a stratified design, and 11% using the new striplet estimator. This large reduction in reported variance is verified by simulation. PMID:21534940

  10. A survey of spacecraft thermal design solutions

    NASA Technical Reports Server (NTRS)

    Humphries, R.; Wegrich, R.; Pierce, E.; Patterson, W.

    1991-01-01

    A number of thermal projects are outlined giving a perspective on the scope and depth of activities in the thermal control group. A set of designs are presented in a form to illustrate some of the more innovative work. Design configurations, solution techniques, and flight anomalies are discussed. Activities include the instruments of the Hubble Space Telescope, Space Station Freedom, and Spacelab.

  11. Use of multispectral data in design of forest sample surveys

    NASA Technical Reports Server (NTRS)

    Titus, S. J.; Wensel, L. C.

    1977-01-01

    The use of multispectral data in design of forest sample surveys using a computer software package is described. The system allows evaluation of a number of alternative sampling systems and, with appropriate cost data, estimates the implementation cost for each.

  12. 7. Historic American Buildings Survey ORIGINAL DESIGN SUBMITTED BY PEABODY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. Historic American Buildings Survey ORIGINAL DESIGN SUBMITTED BY PEABODY AND STEARNS (FROM THE ORIGINAL IN THE LIBRARY OF THE VOLTA BUREAU) - Volta Bureau, 1537 Thirty-fifth Street Northwest, Washington, District of Columbia, DC

  13. The Dark Energy Survey instrument design

    SciTech Connect

    Flaugher, B.; /Fermilab

    2006-05-01

    We describe a new project, the Dark Energy Survey (DES), aimed at measuring the dark energy equation of state parameter, w, to a statistical precision of ~5%, with four complementary techniques. The survey will use a new 3 sq. deg. mosaic camera (DECam) mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). DECam includes a large mosaic camera, a five element optical corrector, four filters (g,r,i,z), and the associated infrastructure for operation in the prime focus cage. The focal plane consists of 62 2K x 4K CCD modules (0.27''/pixel) arranged in a hexagon inscribed within the 2.2 deg. diameter field of view. We plan to use the 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). At Fermilab, we will establish a packaging factory to produce four-side buttable modules for the LBNL devices, as well as to test and grade the CCDs. R&D is underway and delivery of DECam to CTIO is scheduled for 2009.
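The quoted focal-plane numbers are mutually consistent, as a quick arithmetic check shows (assuming "2K x 4K" means 2048 x 4096 pixels and ignoring gaps between CCD modules):

```python
ccds = 62
pixels_per_ccd = 2048 * 4096
pixel_scale = 0.27  # arcsec per pixel, as quoted

area_arcsec2 = ccds * pixels_per_ccd * pixel_scale ** 2
area_deg2 = area_arcsec2 / 3600.0 ** 2  # 3600 arcsec per degree
print(round(area_deg2, 2))  # 2.93 deg^2, consistent with "3 sq. deg."
```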

  14. Sloan Digital Sky Survey imaging camera: design and performance

    NASA Astrophysics Data System (ADS)

    Rockosi, Constance M.; Gunn, James E.; Carr, Michael A.; Sekiguchi, Masaki; Ivezic, Zeljko; Munn, Jeffrey A.

    2002-12-01

    The Sloan Digital Sky Survey (SDSS) imaging camera saw first light in May 1998, and has been in regular operation since the start of the survey in April 2000. We review here key elements in the design of the instrument driven by the specific goals of the survey, and discuss some of the operational issues involved in keeping the instrument ready to observe at all times and in monitoring its performance. We present data on the mechanical and photometric stability of the camera, using on-sky survey data as collected and processed to date.

  15. Sample design for the residential energy consumption survey

    SciTech Connect

    Not Available

    1994-08-01

    The purpose of this report is to provide detailed information about the multistage area-probability sample design used for the Residential Energy Consumption Survey (RECS). It is intended as a technical report, for use by statisticians, to better understand the theory and procedures followed in the creation of the RECS sample frame. For a more cursory overview of the RECS sample design, refer to the appendix entitled "How the Survey was Conducted," which is included in the statistical reports produced for each RECS survey year.

  16. Optical Design for a Survey X-Ray Telescope

    NASA Technical Reports Server (NTRS)

    Saha, Timo T.; Zhang, William W.; McClelland, Ryan S.

    2014-01-01

    Optical design trades are underway at the Goddard Space Flight Center to define a telescope for an x-ray survey mission. Top-level science objectives of the mission include the study of x-ray transients, surveying and long-term monitoring of compact objects in nearby galaxies, as well as both deep and wide-field x-ray surveys. In this paper we consider Wolter, Wolter-Schwarzschild, and modified Wolter-Schwarzschild telescope designs as basic building blocks for the tightly nested survey telescope. Design principles and dominating aberrations of individual telescopes and nested telescopes are discussed and we compare the off-axis optical performance at 1.0 keV and 4.0 keV across a 1.0-degree full field-of-view.

  17. Survey of quantitative data on the solar energy and its spectra distribution

    NASA Technical Reports Server (NTRS)

    Thekaekara, M. P.

    1976-01-01

    This paper presents a survey of available quantitative data on the total and spectral solar irradiance at ground level and outside the atmosphere. Measurements from research aircraft have resulted in the currently accepted NASA/ASTM standards of the solar constant and zero air mass solar spectral irradiance. The intrinsic variability of solar energy output and programs currently under way for more precise measurements from spacecraft are discussed. Instrumentation for solar measurements and their reference radiation scales are examined. Insolation data available from the records of weather stations are reviewed for their applicability to solar energy conversion. Two alternate methods of solarimetry are briefly discussed.

  18. Bird community as an indicator of biodiversity: results from quantitative surveys in Brazil.

    PubMed

    Vielliard, J M

    2000-09-01

    This short review presents the results obtained in several localities of Brazil on the composition of forest bird communities. Data have been collected since the late 1980s, when we introduced a new methodology of quantitative survey based on acoustic identification and unlimited-radius point censuses. Although these data are still scattered, they show uniquely precise and coherently comparable patterns of composition of forest bird communities. Our methodology has the advantage of being absolutely non-disturbing, highly efficient in the field, and immediately processed. Results confirm that the structure of a bird community is a good indicator of biodiversity, particularly useful where biodiversity is high. Many of these data are available only in unpublished dissertations and abstracts of congress communications, or are still being analysed. A cooperative program is needed to promote new surveys and publish their results, as a contribution to measuring and monitoring biodiversity, especially in complex endangered habitats. PMID:11028097

  19. Designer substrate library for quantitative, predictive modeling of reaction performance

    PubMed Central

    Bess, Elizabeth N.; Bischoff, Amanda J.; Sigman, Matthew S.

    2014-01-01

    Assessment of reaction substrate scope is often a qualitative endeavor that provides general indications of substrate sensitivity to a measured reaction outcome. Unfortunately, this field standard typically falls short of enabling the quantitative prediction of new substrates’ performance. The disconnection between a reaction’s development and the quantitative prediction of new substrates’ behavior limits the applicative usefulness of many methodologies. Herein, we present a method by which substrate libraries can be systematically developed to enable quantitative modeling of reaction systems and the prediction of new reaction outcomes. Presented in the context of rhodium-catalyzed asymmetric transfer hydrogenation, these models quantify the molecular features that influence enantioselection and, in so doing, lend mechanistic insight to the modes of asymmetric induction. PMID:25267648

  20. Mobile Libraries, Design and Construction: A Survey of Current Practice.

    ERIC Educational Resources Information Center

    Eastwood, C. R.; And Others

    Forty-one county libraries in Wales, Scotland and England were surveyed in 1970 in an attempt to establish current practice in the design and construction of mobile libraries. This report is the first step of the Branch and Mobile Libraries Group of the Library Association to establish standards for mobile library design and construction. The…

  1. A survey of spacecraft thermal design solutions

    NASA Technical Reports Server (NTRS)

    Humphries, R.; Wegrich, R.; Pierce, E.; Patterson, W.

    1991-01-01

    A review of activities at the NASA/Marshall Space Flight Center in the heat transfer and thermodynamics disciplines as well as attendant fluid mechanics, transport phenomena, and computer science applications is presented. Attention is focused on recent activities including the Hubble Space Telescope, and large space instruments, particularly telescope thermal control systems such as those flown aboard Spacelab 2 and the Astro missions. Emphasis is placed on defining the thermal control features, unique design schemes, and performance of selected programs. Results obtained both by ground testing and analytical means, as well as flight and postflight data are presented.

  2. Quantitative research for virtual mobile robot design cycle based on PERT

    NASA Astrophysics Data System (ADS)

    Wang, Jingkun; Ai, Xing; Zhang, Jinsheng; Chong, Kil To

    2010-01-01

    Robot design determines 75% to 80% of a mobile robot's cost, and to a large extent it also determines production efficiency and performance. Quantitative evaluation of the virtual design cycle has long been a problem for robot designers. In this paper, the CBD technique is used to design mobile robots, and the Program Evaluation and Review Technique (PERT) is used to compare the virtual and traditional design cycles quantitatively. The comparison shows that virtual design shortened the design cycle by more than 40 percent and that, for the same design-cycle length, the completion probability with virtual design is much higher than with traditional design.
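The PERT machinery such a comparison relies on reduces to the classic three-point estimate and a normal approximation along the critical path. The task durations below are hypothetical placeholders, not the robot-design activities from the paper:

```python
from statistics import NormalDist

def pert_stats(tasks):
    """tasks: (optimistic, most_likely, pessimistic) duration triples
    along the critical path. Returns (expected duration, std deviation)
    using the PERT beta approximation te = (o + 4m + p) / 6."""
    mean = sum((o + 4 * m + p) / 6 for o, m, p in tasks)
    var = sum(((p - o) / 6) ** 2 for o, m, p in tasks)
    return mean, var ** 0.5

def completion_probability(tasks, deadline):
    """P(project finishes by deadline), normal approximation."""
    mean, sd = pert_stats(tasks)
    return NormalDist(mean, sd).cdf(deadline)

# Hypothetical design activities, durations in weeks
tasks = [(1, 2, 3), (2, 4, 6), (3, 5, 7)]
mean, sd = pert_stats(tasks)
print(mean, round(sd, 6))                           # 11.0 1.0
print(round(completion_probability(tasks, 12), 4))  # 0.8413
```

Shortening expected task durations (as the paper claims virtual design does) raises the completion probability at any fixed deadline, which is the comparison the abstract reports.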

  4. Hemostatic assessment, treatment strategies, and hematology consultation in massive postpartum hemorrhage: results of a quantitative survey of obstetrician-gynecologists

    PubMed Central

    James, Andra H; Cooper, David L; Paidas, Michael J

    2015-01-01

    Objective To assess potential diagnostic and practice barriers to successful management of massive postpartum hemorrhage (PPH), emphasizing recognition and management of contributing coagulation disorders. Study design A quantitative survey was conducted to assess practice patterns of US obstetrician-gynecologists in managing massive PPH, including assessment of coagulation. Results Nearly all (98%) of the 50 obstetrician-gynecologists participating in the survey reported having encountered at least one patient with “massive” PPH in the past 5 years. Approximately half (52%) reported having previously discovered an underlying bleeding disorder in a patient with PPH, with disseminated intravascular coagulation (88%, n=23/26) being identified more often than von Willebrand disease (73%, n=19/26). All reported having used methylergonovine and packed red blood cells in managing massive PPH, while 90% reported performing a hysterectomy. A drop in blood pressure and ongoing visible bleeding were the most commonly accepted indications for rechecking a “stat” complete blood count and coagulation studies, respectively, in patients with PPH; however, 4% of respondents reported that they would not routinely order coagulation studies. Forty-two percent reported having never consulted a hematologist for massive PPH. Conclusion The survey findings highlight potential areas for improved practice in managing massive PPH, including earlier and more consistent assessment, monitoring of coagulation studies, and consultation with a hematologist. PMID:26604829

  5. Quantitative label-free phosphoproteomics strategy for multifaceted experimental designs.

    PubMed

    Soderblom, Erik J; Philipp, Melanie; Thompson, J Will; Caron, Marc G; Moseley, M Arthur

    2011-05-15

    Protein phosphorylation is a critical regulator of signaling in nearly all eukaryotic cellular pathways and dysregulated phosphorylation has been implicated in an array of diseases. The majority of MS-based quantitative phosphorylation studies are currently performed from transformed cell lines because of the ability to generate large amounts of starting material with incorporated isotopically labeled amino acids during cell culture. Here we describe a general label-free quantitative phosphoproteomic strategy capable of directly analyzing relatively small amounts of virtually any biological matrix, including human tissue and biological fluids. The strategy utilizes a TiO2 enrichment protocol in which the selectivity and recovery of phosphopeptides were optimized by assessing a twenty-point condition matrix of binding modifier concentrations and peptide-to-resin capacity ratios. The quantitative reproducibility of the TiO2 enrichment was determined to be 16% RSD through replicate enrichments of a wild-type Danio rerio (zebrafish) lysate. Measured phosphopeptide fold-changes from alpha-casein spiked into wild-type zebrafish lysate backgrounds were within 5% of the theoretical value. Application to a morpholino induced knock-down of G protein-coupled receptor kinase 5 (GRK5) in zebrafish embryos resulted in the quantitation of 719 phosphorylated peptides corresponding to 449 phosphorylated proteins from 200 μg of zebrafish embryo lysates. PMID:21491946

  6. A Quantitative Approach to the Design of School Bus Routes.

    ERIC Educational Resources Information Center

    Tracz, George S.

    A number of factors--including the reorganization of school administrative structures, the availability of new technology, increased competition among groups for limited resources, and changing patterns of communication--suggest an increased need for quantitative analysis in the school district decision-making process. One area of school…

  8. Survey of Campylobacter jejuni in retail chicken meat products by application of a quantitative PCR protocol.

    PubMed

    Rantsiou, Kalliopi; Lamberti, Cristina; Cocolin, Luca

    2010-07-31

Campylobacter-contaminated food products are currently the cause of the highest number of gastroenteritis cases in developed countries. Apart from biosafety measures at the primary production level, no other official control measures are currently in place. This is partly due to the lack of quantitative data on the prevalence and contamination levels of different food products with Campylobacter spp., which precludes quantitative risk assessment. PCR-based methods applied directly to food samples, without prior enrichment, circumvent limitations associated with the quantification of foodborne pathogens by traditional, culture-dependent methods. In this study, we report the development of a protocol, based on the amplification of the rpoB gene of Campylobacter jejuni by quantitative PCR (qPCR), directly in food samples. The quantification limit of the protocol was determined to be on the order of 10 colony forming units (cfu)/g or ml of food sample. The optimized protocol was applied in a survey of C. jejuni in naturally contaminated poultry samples. In parallel, traditional sampling was also performed. A high percentage of samples (87%) were positive by qPCR, while no C. jejuni was detected by traditional analysis. Furthermore, important differences were observed in qPCR detection between samples before and after enrichment. PMID:20207040
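The abstract does not detail the quantification step, but qPCR quantification is conventionally done by inverting a linear standard curve of threshold cycle (Ct) against log10 cfu. A sketch under that assumption, with made-up standard-curve data (a perfect-efficiency assay would have a slope near -3.32 per decade):

```python
import math

# Hypothetical standard curve: Ct measured for known C. jejuni spike
# levels (cfu/g). These numbers are illustrative, not from the paper.
standards = [(1e6, 18.0), (1e5, 21.3), (1e4, 24.6),
             (1e3, 27.9), (1e2, 31.2), (1e1, 34.5)]

def fit_line(points):
    """Least-squares fit of Ct against log10(cfu); returns (slope, intercept)."""
    xs = [math.log10(c) for c, _ in points]
    ys = [ct for _, ct in points]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    """Invert the standard curve to estimate cfu/g from a sample Ct."""
    return 10 ** ((ct - intercept) / slope)

slope, intercept = fit_line(standards)
# A sample Ct midway between the 1e3 and 1e2 standards
print(round(quantify(29.55, slope, intercept)))  # → 316
```

With a detection limit around 10 cfu/g, any estimate below that line would be reported as "detected but not quantifiable".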

  9. A SUCCESSFUL BROADBAND SURVEY FOR GIANT Ly{alpha} NEBULAE. I. SURVEY DESIGN AND CANDIDATE SELECTION

    SciTech Connect

    Prescott, Moire K. M.; Dey, Arjun; Jannuzi, Buell T.

    2012-04-01

Giant Ly{alpha} nebulae (or Ly{alpha} 'blobs') are likely sites of ongoing massive galaxy formation, but the rarity of these powerful sources has made it difficult to form a coherent picture of their properties, ionization mechanisms, and space density. Systematic narrowband Ly{alpha} nebula surveys are ongoing, but the small redshift range covered and the observational expense limit the comoving volume that can be probed by even the largest of these surveys and pose a significant problem when searching for such rare sources. We have developed a systematic search technique designed to find large Ly{alpha} nebulae at 2 {approx}< z {approx}< 3 within deep broadband imaging and have carried out a survey of the 9.4 deg{sup 2} NOAO Deep Wide-Field Survey Boötes field. With a total survey comoving volume of {approx}10{sup 8} h{sub 70}{sup -3} Mpc{sup 3}, this is the largest-volume survey for Ly{alpha} nebulae ever undertaken. In this first paper in the series, we present the details of the survey design and a systematically selected sample of 79 candidates, which includes one previously discovered Ly{alpha} nebula.

  10. Multidisciplinary aerospace design optimization: Survey of recent developments

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Haftka, Raphael T.

    1995-01-01

The increasing complexity of engineering systems has sparked increasing interest in multidisciplinary optimization (MDO). This paper presents a survey of recent publications in the field of aerospace, where interest in MDO has been particularly intense. The two main challenges of MDO are computational expense and organizational complexity. The survey is accordingly focused on the various ways in which researchers deal with these challenges, and it is organized by a breakdown of MDO into its conceptual components, with sections on Mathematical Modeling, Design-oriented Analysis, Approximation Concepts, Optimization Procedures, System Sensitivity, and Human Interface. With the authors' main expertise being in the structures area, the bulk of the references focus on the interaction of the structures discipline with other disciplines. In particular, two sections at the end focus on two such interactions that have recently been pursued with particular vigor: Simultaneous Optimization of Structures and Aerodynamics, and Simultaneous Optimization of Structures Combined With Active Control.

  11. Design and Architecture of Collaborative Online Communities: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Aviv, Reuven; Erlich, Zippy; Ravid, Gilad

    2004-01-01

This paper considers four aspects of online communities: design, mechanisms, architecture, and the constructed knowledge. We hypothesize that different designs of communities drive different mechanisms, which give rise to different architectures, which in turn result in different levels of collaborative knowledge construction. To test this chain…

  12. Simulating future uncertainty to guide the selection of survey designs for long-term monitoring

    USGS Publications Warehouse

    Garman, Steven L.; Schweiger, E. William; Manier, Daniel J.

    2012-01-01

    A goal of environmental monitoring is to provide sound information on the status and trends of natural resources (Messer et al. 1991, Theobald et al. 2007, Fancy et al. 2009). When monitoring observations are acquired by measuring a subset of the population of interest, probability sampling as part of a well-constructed survey design provides the most reliable and legally defensible approach to achieve this goal (Cochran 1977, Olsen et al. 1999, Schreuder et al. 2004; see Chapters 2, 5, 6, 7). Previous works have described the fundamentals of sample surveys (e.g. Hansen et al. 1953, Kish 1965). Interest in survey designs and monitoring over the past 15 years has led to extensive evaluations and new developments of sample selection methods (Stevens and Olsen 2004), of strategies for allocating sample units in space and time (Urquhart et al. 1993, Overton and Stehman 1996, Urquhart and Kincaid 1999), and of estimation (Lesser and Overton 1994, Overton and Stehman 1995) and variance properties (Larsen et al. 1995, Stevens and Olsen 2003) of survey designs. Carefully planned, “scientific” (Chapter 5) survey designs have become a standard in contemporary monitoring of natural resources. Based on our experience with the long-term monitoring program of the US National Park Service (NPS; Fancy et al. 2009; Chapters 16, 22), operational survey designs tend to be selected using the following procedures. For a monitoring indicator (i.e. variable or response), a minimum detectable trend requirement is specified, based on the minimum level of change that would result in meaningful change (e.g. degradation). A probability of detecting this trend (statistical power) and an acceptable level of uncertainty (Type I error; see Chapter 2) within a specified time frame (e.g. 10 years) are specified to ensure timely detection. 
Explicit statements of the minimum detectable trend, the time frame for detecting the minimum trend, power, and acceptable probability of Type I error (α) collectively form the quantitative sampling objective.
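The quantitative sampling objective described above can be turned into a concrete power computation. The sketch below uses a deliberately simplified model (annual observations of a single indicator, independent normal errors with known standard deviation, two-sided z-test on the fitted slope); real monitoring designs must also handle revisit panels and variance components, and all numbers here are illustrative assumptions:

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def norm_ppf(p, lo=-10.0, hi=10.0):
    """Inverse standard normal CDF by bisection (sufficient accuracy here)."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def trend_power(slope, sigma, years, alpha=0.05):
    """Power of a two-sided z-test for a linear trend measured annually,
    assuming independent errors with known std dev sigma."""
    xs = list(range(years))
    mx = sum(xs) / years
    sxx = sum((x - mx) ** 2 for x in xs)
    se = sigma / math.sqrt(sxx)          # standard error of the slope
    z_alpha = norm_ppf(1 - alpha / 2)
    ncp = abs(slope) / se                # noncentrality
    return 1 - norm_cdf(z_alpha - ncp) + norm_cdf(-z_alpha - ncp)

# Illustrative objective: detect a trend of 2 units/yr against residual
# noise of 5 units within a 10-year time frame at alpha = 0.05
print(round(trend_power(2.0, 5.0, 10), 3))
```

Iterating this calculation over candidate designs (more years, more sites, lower sigma) is exactly how the minimum-detectable-trend, power, and Type I error requirements jointly drive the choice of survey design.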

  13. Acoustical Surveys of Methane Plumes by Using the Quantitative Echo Sounder in Japan Sea

    NASA Astrophysics Data System (ADS)

    Aoyama, C.; Matsumoto, R.; Hiruta, A.; Machiyama, H.; Numanami, H.; Tomaru, H.; Snyder, G.; Hiromatsu, M.; Igeta, Y.; Freitas, L.

    2006-12-01

R&T/V Umitaka-maru (Tokyo Univ. of Marine Science and Technology) and R/V Natsushima (JAMSTEC) sailed to the methane seep area on a small ridge in the Naoetsu Basin, in the eastern margin of the Sea of Japan, in July 2004, July 2005, and July 2006 to survey gas hydrate in the ocean floor and related acoustic signatures of methane plumes by using a quantitative echo sounder. Detailed bathymetric profiles have revealed a number of mounds, pockmarks and collapsed structures within 3 km x 4 km on the ridge at water depths of 910 m to 980 m. We mapped methane plumes in detail by using a quantitative echo sounder (frequency 38 kHz, beam width -19.1 dB) with positioning data from GPS. The vessels sailed at intervals of 0.05 nmi at speeds under 3 kt. We also measured the averaged echo intensity from the methane plumes and the sea bottom, both in every 100-m range and every one minute, by echo integrator. We obtained the following results from the present echo-sounder survey. 1) We mapped the methane plumes and the seep areas in detail; the plumes lie over the pockmark-mound zone. 2) In the 2005 survey, we identified several methane plumes on the echogram in another area included in the 2004 survey. 3) The average volume backscattering strength (SV) of each methane plume tends to be related to water temperature and water pressure. The hydrate bubbles float upward until they reach warm waters at 300 m depth; the gas volume abruptly increases at this point as the hydrate coating melts. 4) We recovered several fist-sized chunks of methane hydrate by piston coring in the area where we observed the methane plumes.
As a follow-up project, we are planning 1) to measure the SV of methane bubbles and methane hydrate floating in water columns by using the submarine vehicle Hyper Dolphin, 2) to make a trial calculation of the amount of floating methane bubbles and methane hydrates, and 3) to study how to sample the acoustical data of methane plumes by using a side-scanning sonar, SEABAT.

  14. Survey design and extent estimates for the National Lakes Assessment

    EPA Science Inventory

The US Environmental Protection Agency (EPA) conducted a National Lakes Assessment (NLA) in the conterminous USA in 2007 as part of a national assessment of aquatic resources using probability-based survey designs. The USEPA Office of Water led the assessment, in cooperation with...

  15. Use of multispectral data in design of forest sample surveys

    NASA Technical Reports Server (NTRS)

    Titus, S. J.; Wensel, L. C.

    1977-01-01

    The use of multispectral data in design of forest sample surveys using a computer software package, WILLIAM, is described. The system allows evaluation of a number of alternative sampling systems and, with appropriate cost data, estimates the implementation cost for each.

  16. Engaging Students in Survey Design and Data Collection

    ERIC Educational Resources Information Center

    Sole, Marla A.

    2015-01-01

    Every day, people use data to make decisions that affect their personal and professional lives, trusting that the data are correct. Many times, however, the data are inaccurate, as a result of a flaw in the design or methodology of the survey used to collect the data. Researchers agree that only questions that are clearly worded, unambiguous, free…

  17. Automatic detection and quantitative assessment of peculiar galaxy pairs in Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Shamir, Lior; Wallin, John

    2014-10-01

We applied computational tools for the automatic detection of peculiar galaxy pairs. We first detected in Sloan Digital Sky Survey DR7 ˜400 000 galaxy images with i magnitude <18 that had more than one point spread function, and then applied a machine learning algorithm that detected ˜26 000 galaxy images whose morphology was similar to that of galaxy mergers. That data set was mined using a novelty detection algorithm, producing a short list of the 500 most peculiar galaxies as quantitatively determined by the algorithm. Manual examination showed that while most of the galaxy pairs in the list were not necessarily peculiar, numerous unusual galaxy pairs were detected. In this paper, we describe the protocol and computational tools used for the detection of peculiar mergers, and provide examples of peculiar galaxy pairs that were detected.
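The abstract does not specify the novelty-detection algorithm. The general idea, ranking samples by how far they sit from the bulk of the data in feature space, can be sketched as follows; this is a hypothetical stand-in, not the authors' method, and the feature values are invented:

```python
import math

def novelty_ranking(features, top_n):
    """Rank samples by Euclidean distance from the feature-space centroid;
    the most distant samples are flagged as the most 'peculiar'."""
    dims = len(features[0])
    centroid = [sum(f[d] for f in features) / len(features) for d in range(dims)]
    def dist(f):
        return math.sqrt(sum((f[d] - centroid[d]) ** 2 for d in range(dims)))
    ranked = sorted(range(len(features)), key=lambda i: dist(features[i]),
                    reverse=True)
    return ranked[:top_n]

# Toy morphology descriptors: sample 3 lies far from the cluster of the others
feats = [[0.1, 0.2], [0.2, 0.1], [0.15, 0.15], [5.0, 4.0], [0.12, 0.18]]
print(novelty_ranking(feats, 1))  # → [3]
```

Real novelty detectors (one-class SVMs, isolation forests, density estimators) refine this distance idea, but the output is the same kind of ranked candidate list that the paper's manual examination step consumes.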

  18. Technical note: Designing and analyzing quantitative factorial experiments.

    PubMed

    St-Pierre, N R; Weiss, W P

    2009-09-01

The response of a biological process to various factors is generally nonlinear, with many interactions among those factors. Although meta-analyses of data across multiple studies can help in identifying and quantifying interactions among factors, missing latent variables can result in serious misinterpretation. Eventually, all influential factors have to be studied simultaneously in one single experiment. Because of the curvature of the expected response and the presence of interactions among factors, the size of experiments grows very large, even when only 3 or 4 factors are fully arranged. There exists a class of experimental designs, named central composite designs (CCD), that considerably reduces the number of treatments required to estimate all the terms of a second-order polynomial equation without any loss of efficiency compared with the full factorial design. The objective of this technical note is to explain the construction of a CCD and its statistical analysis using the Statistical Analysis System. In short, a CCD consists of 2^k treatment points (a first-order factorial design, where k represents the number of factors), augmented by at least one center point and 2k axial treatments. For 3 factors, the resulting design has 16 treatment points, compared with 27 for a full factorial design. For 4 factors, the CCD has 25 treatment points, compared with 81 for a full factorial design. The CCD can be made orthogonal (no correlation between parameter estimates) or rotatable (the variance of the estimated response is a function only of the distance from the design center and not the direction) by the location of the axial treatments. In spite of the reduced number of treatments compared with a full 3^k factorial, the CCD is relatively efficient in estimation of the quadratic and interaction terms. Blocking of experimental units is often desirable and is sometimes required. Randomized block designs for CCD are found in some statistical design textbooks.
The construction of incomplete, balanced (or near-balanced) designs for CCD experimental layouts is explained using an example. The Statistical Analysis System statements used to analyze a CCD, to identify the significant parameters in the polynomial equation, and to produce parameter estimates are presented and explained. PMID:19700721
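The treatment counts quoted above follow directly from the CCD construction: 2^k factorial corners, 2k axial points, plus center points. A minimal sketch of generating the coded design points (the abstract's count of 16 for k = 3 presumably includes a replicated center point, since 2^3 + 2·3 + 1 = 15):

```python
from itertools import product

def central_composite(k, alpha=None, n_center=1):
    """Coded treatment points of a central composite design:
    2^k factorial corners at ±1, 2k axial points at ±alpha, plus centers.
    alpha = (2^k)^(1/4) gives a rotatable design."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25
    corners = list(product((-1, 1), repeat=k))
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(tuple(pt))
    centers = [(0.0,) * k] * n_center
    return corners + axial + centers

# For k = 4: 16 corners + 8 axial + 1 center = 25 treatment points,
# versus 3^4 = 81 for the full three-level factorial
print(len(central_composite(4)))  # → 25
```

The coded levels would then be mapped to real factor ranges before running the experiment; the axial distance `alpha` is the lever for making the design rotatable or orthogonal, as the note describes.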

  19. 23 CFR 1340.11 - Post-approval alterations to survey design.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Post-approval alterations to survey design. 1340.11... Post-approval alterations to survey design. After NHTSA approval of a survey design, States shall submit for NHTSA approval any proposed alteration to their survey design, including, but not limited...

  20. 23 CFR 1340.11 - Post-approval alterations to survey design.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 23 Highways 1 2013-04-01 2013-04-01 false Post-approval alterations to survey design. 1340.11... Post-approval alterations to survey design. After NHTSA approval of a survey design, States shall submit for NHTSA approval any proposed alteration to their survey design, including, but not limited...

  1. 23 CFR 1340.11 - Post-approval alterations to survey design.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false Post-approval alterations to survey design. 1340.11... Post-approval alterations to survey design. After NHTSA approval of a survey design, States shall submit for NHTSA approval any proposed alteration to their survey design, including, but not limited...

  2. Survey of quantitative antimicrobial consumption per production stage in farrow-to-finish pig farms in Spain

    PubMed Central

    Moreno, Miguel A.

    2014-01-01

Objectives To characterise antimicrobial use (AMU) per production stage, in terms of drugs, routes of administration, indications, duration and exposed animals, in farrow-to-finish pig farms in Spain. Design Survey using a questionnaire on AMU during the six months prior to the interview, administered in face-to-face interviews completed from April to October 2010. Participants 108 potentially eligible farms covering the whole country were selected using a multistage sampling methodology; of these, 33 were excluded because they did not fulfil the participation criteria and 49 were surveyed. Results Ranked by farm, production stage and administration route, the most used antimicrobials were polymyxins (colistin) in feed during the growing and preweaning phases, followed by β-lactams in feed during the growing and preweaning phases and by injection during the preweaning phase. Conclusions The study demonstrates that the growing stage (from weaning to the start of finishing) has the highest AMU according to different quantitative indicators (number of records, number of antimicrobials used, percentage of farms reporting use, relative number of exposed animals per farm and duration of exposure); that feed is the administration route producing the highest antimicrobial exposure, based on the larger number of exposed animals and the longer duration of treatment; and that there are large differences in AMU among individual pig farms. PMID:26392868

  3. Multiwavelength CO2 DIAL system designed for quantitative concentration measurement

    NASA Astrophysics Data System (ADS)

    Leonelli, Joseph; van der Laan, Jan; Holland, Peter; Fletcher, Leland; Warren, Russell

    A multiwavelength CO2 direct-detection differential absorption lidar (DIAL) system capable of providing range-resolved vapor-concentration contour plots of a 1000 sq m grid at 20-m spatial resolution in 10-s intervals is reported. Design goals are outlined along with system specifications. The self-contained mobile system is modular in design and can detect, identify, quantify, and map chemical vapor clouds having significant spectral structure in the 9- to 11-micron region. The lidar package and data system are described, and focus is placed on the development of pulse clippers.

  4. Optimum structural design with plate bending elements - A survey

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.; Prasad, B.

    1981-01-01

    A survey is presented of recently published papers in the field of optimum structural design of plates, largely with respect to the minimum-weight design of plates subject to such constraints as fundamental frequency maximization. It is shown that, due to the availability of powerful computers, the trend in optimum plate design is away from methods tailored to specific geometry and loads and toward methods that can be easily programmed for any kind of plate, such as finite element methods. A corresponding shift is seen in optimization from variational techniques to numerical optimization algorithms. Among the topics covered are fully stressed design and optimality criteria, mathematical programming, smooth and ribbed designs, design against plastic collapse, buckling constraints, and vibration constraints.

  5. Acoustic Surveys of Methane Plumes by Quantitative Echo Sounder in the Eastern Margin of Japan Sea

    NASA Astrophysics Data System (ADS)

    Aoyama, C.; Matsumoto, R.

    2009-12-01

During methane hydrate exploration and research, remote and onboard acoustic surveying and monitoring of methane hydrate can be easily and economically conducted using a quantitative echo sounder. Simultaneously, the structure and the floating-up speed of methane plumes can be obtained from an analysis of acoustic data. We conducted a survey of methane plumes from 2004 through 2008 at a spur situated southwest off the coast of Sado Island, tentatively called Umitaka Spur, and at the Joetsu Knoll. In 2007 and 2008, we performed experiments by releasing methane bubbles and methane hydrate and letting them float upward. Consequently, we demonstrated that the acoustical reflection from the methane plumes correlates with water temperature and depth, that the floating-up speed is constant but depends on the condition of the methane hydrate, that the discharge of methane bubbles changes, and that there is a wide scattering of materials below the seafloor where methane plumes are located. The method will be applied not only to basic research on methane hydrate but also to assessments of the environmental impact of methane hydrate exploitation.

  6. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
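As a concrete illustration of the response-surface idea surveyed here, the sketch below fits a quadratic polynomial metamodel to a few samples of a stand-in "expensive analysis code" (here just a cheap quadratic, purely for illustration) and then evaluates the cheap surrogate at a new design point:

```python
def fit_quadratic(xs, ys):
    """Least-squares fit y ≈ b0 + b1*x + b2*x^2 via the 3x3 normal
    equations, solved with Gaussian elimination and partial pivoting."""
    # Build X^T X and X^T y for the basis [1, x, x^2]
    A = [[sum(x ** (i + j) for x in xs) for j in range(3)] for i in range(3)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    for col in range(3):                     # forward elimination
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * 3                       # back substitution
    for r in (2, 1, 0):
        coeffs[r] = (b[r] - sum(A[r][c] * coeffs[c]
                                for c in range(r + 1, 3))) / A[r][r]
    return coeffs

# Stand-in for the expensive code: y = (x - 2)^2 + 1, sampled at design points
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [(x - 2) ** 2 + 1 for x in xs]
b0, b1, b2 = fit_quadratic(xs, ys)
print(round(b0 + b1 * 2.5 + b2 * 2.5 ** 2, 6))  # surrogate prediction at x = 2.5
```

An optimizer would then query this polynomial instead of the expensive code, which is exactly the substitution (and the source of the "fitting deterministic codes with stochastic models" pitfall) that the paper examines.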

  7. Large Synoptic Survey Telescope: From Science Drivers to Reference Design

    SciTech Connect

Ivezic, Z.; Axelrod, T.; Brandt, W.N.; Burke, D.L.; Claver, C.F.; Connolly, A.; Cook, K.H.; Gee, P.; Gilmore, D.K.; Jacoby, S.H.; Jones, R.L.; Kahn, S.M.; Kantor, J.P.; Krabbendam, V.; Lupton, R.H.; Monet, D.G.; Pinto, P.A.; Saha, A.; Schalk, T.L.; Schneider, D.P.; Strauss, Michael A.; et al.

    2011-10-14

    In the history of astronomy, major advances in our understanding of the Universe have come from dramatic improvements in our ability to accurately measure astronomical quantities. Aided by rapid progress in information technology, current sky surveys are changing the way we view and study the Universe. Next-generation surveys will maintain this revolutionary progress. We focus here on the most ambitious survey currently planned in the visible band, the Large Synoptic Survey Telescope (LSST). LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. It will be a large, wide-field ground-based system designed to obtain multiple images covering the sky that is visible from Cerro Pachon in Northern Chile. The current baseline design, with an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg{sup 2} field of view, and a 3,200 Megapixel camera, will allow about 10,000 square degrees of sky to be covered using pairs of 15-second exposures in two photometric bands every three nights on average. The system is designed to yield high image quality, as well as superb astrometric and photometric accuracy. The survey area will include 30,000 deg{sup 2} with {delta} < +34.5{sup o}, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg{sup 2} region about 1000 times in the six bands during the anticipated 10 years of operation. These data will result in databases including 10 billion galaxies and a similar number of stars, and will serve the majority of science programs. The remaining 10% of the observing time will be allocated to special programs such as Very Deep and Very Fast time domain surveys. 
We describe how the LSST science drivers led to these choices of system parameters.

  8. The Hong Kong mental morbidity survey: background and study design.

    PubMed

    Lam, L C W; Chan, W C; Wong, C S M; Chen, E Y H; Ng, R M K; Lee, E H M; Chang, W C; Hung, S F; Cheung, E F C; Sham, P C; Chiu, H F K; Lam, M; Chiang, T P; van Os, J; Lau, J T F; Lewis, G; Bebbington, P

    2014-03-01

Mental disorders are highly prevalent conditions with immense disease burden. To inform health and social services policy formulation, local psychiatric epidemiological data are required. The Hong Kong Mental Morbidity Survey is a 3-year population-based study in which 5700 community-dwelling Chinese adults aged between 16 and 75 years were interviewed, with the aim of evaluating the prevalence, co-morbidity, functional impairment, physical morbidity, and social determinants of significant mental disorders in the population. This paper describes the background and design of the survey, which is the first territory-wide psychiatric epidemiological study in Hong Kong. PMID:24676485

  9. Quantitative structural design of high voltage potted electronic modules

    NASA Astrophysics Data System (ADS)

    Tweedie, A. T.; Lieberman, P. A.

    Failure analysis of traveling wave tubes (TWT's) revealed high voltage arc-overs due to cracks in the potting material. It is suggested that the geometric features of the design caused stresses during thermal cycling, which were less than the static strength of the material, but high enough to cause slow crack growth from flaws in the material. The USAF helped sponsor a program to investigate this phenomenon in relation to the 40 W 293H TWT, and to develop a new design which would have high reliability with respect to potting compound structural failures. An iterated design-analysis process was used and coupled with life predictions based on an understanding of the fracture mechanics of the materials involved. The fundamental design data and analysis procedures consisted of: (1) materials characterization; (2) stress, dynamics and thermal analysis of the TWT and its redesign; (3) measurement of the rate of crack growth in the potting compound as a function of stress and temperature; and (4) life prediction of the redesigned TWT.

  10. Integrated Analysis and Tools for Land Subsidence Surveying and Monitoring: a Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Mosconi, A.; Pozzoli, A.; Meroni, A.; Gagliano, S.

    2015-10-01

This paper presents an integrated approach to land subsidence monitoring using measurements from different sensors. Eni S.p.A., the main Italian oil and gas company, constantly surveys the land with state-of-the-art and innovative techniques, and a method able to integrate the results is an important and timely topic. Today's world is a multi-sensor platform, and measurement integration is strictly necessary. Combining the different data sources should be done in a clever way, taking advantage of the best performance of each technique. An integrated analysis allows the interpretation of simultaneous temporal series of data coming from different sources and attempts to separate the subsidence contributions. With this purpose, Exelis VIS, in collaboration with Eni S.p.A., customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to capitalize on and combine all the different data collected in the surveys. This article presents some significant examples showing the potential of this tool in oil and gas activity: a hydrocarbon storage field where the comparison between SAR and production volumes reveals, in a few steps, a correlation between the two measures; and a hydrocarbon production field with the Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers and assestimeters measure in the same area at the same time, giving the opportunity to analyse the data contextually. In the integrated analysis performed with PISAV a mathematically rigorous study is not always possible, and a semi-quantitative approach is then the only method for interpreting the results.
As a result, in the first test case a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis offered several advantages for monitoring land subsidence: it permits a first qualitative separation of the natural and anthropogenic components of subsidence, and it gives each measurement more reliability and coverage by drawing on the strong points of each technique.

  11. The ZInEP Epidemiology Survey: background, design and methods.

    PubMed

    Ajdacic-Gross, Vladeta; Müller, Mario; Rodgers, Stephanie; Warnke, Inge; Hengartner, Michael P; Landolt, Karin; Hagenmuller, Florence; Meier, Magali; Tse, Lee-Ting; Aleksandrowicz, Aleksandra; Passardi, Marco; Knöpfli, Daniel; Schönfelder, Herdis; Eisele, Jochen; Rüsch, Nicolas; Haker, Helene; Kawohl, Wolfram; Rössler, Wulf

    2014-12-01

This article introduces the design, sampling, field procedures and instruments used in the ZInEP Epidemiology Survey. This survey is one of six ZInEP projects (Zürcher Impulsprogramm zur nachhaltigen Entwicklung der Psychiatrie, i.e. the "Zurich Program for Sustainable Development of Mental Health Services"). It parallels the longitudinal Zurich Study with a sample comparable in age and gender, and with similar methodology, including identical instruments. Thus, it is aimed at assessing changes in the prevalence rates of common mental disorders and in the use of professional help and psychiatric services. Moreover, the current survey widens the spectrum of topics by including sociopsychiatric questionnaires on stigma, stress-related biological measures such as load and cortisol levels, electroencephalographic (EEG) and near-infrared spectroscopy (NIRS) examinations with various paradigms, and sociophysiological tests. The structure of the ZInEP Epidemiology Survey entails four subprojects: a short telephone screening using the SCL-27 (n of nearly 10,000), a comprehensive face-to-face interview based on the SPIKE (Structured Psychopathological Interview and Rating of the Social Consequences for Epidemiology: the main instrument of the Zurich Study) with a stratified sample (n = 1500), tests in the Center for Neurophysiology and Sociophysiology (n = 227), and a prospective study with up to three follow-up interviews and further measures (n = 157). In sum, the four subprojects of the ZInEP Epidemiology Survey deliver a large interdisciplinary database. PMID:24942564

  12. Exploring the utility of quantitative network design in evaluating Arctic sea-ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-03-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve ten-day to five-month sea-ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett Ice Severity Index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea-ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.

  13. Exploring the utility of quantitative network design in evaluating Arctic sea ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-08-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve 10-day to 5-month sea ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett ice severity index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.
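    The QND idea above can be illustrated with a toy linear-Gaussian calculation (an invented three-variable system, not the authors' coupled ice-ocean model or their actual transects): each candidate observing network is scored by the posterior uncertainty it leaves in a scalar forecast target after a Kalman-style update.

```python
import numpy as np

# Toy quantitative network design (QND) scoring: rank candidate observation
# networks by how much they shrink the posterior variance of a forecast
# target. The 3-variable system and all numbers below are invented.

P = np.diag([1.0, 0.5, 0.25])      # prior covariance of the control variables
g = np.array([0.8, 0.3, 0.1])      # sensitivity of the forecast target to them

def posterior_target_var(H, r):
    """Posterior target variance after assimilating observations with
    Jacobian H (m x n) and independent error variances r (length m)."""
    R = np.diag(r)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    P_post = (np.eye(P.shape[0]) - K @ H) @ P
    return float(g @ P_post @ g)

prior_var = float(g @ P @ g)
# Network A observes only variable 1; network B additionally observes variable 2.
var_A = posterior_target_var(np.array([[1.0, 0.0, 0.0]]), np.array([0.1]))
var_B = posterior_target_var(np.array([[1.0, 0.0, 0.0],
                                       [0.0, 1.0, 0.0]]), np.array([0.1, 0.1]))
assert var_B < var_A < prior_var   # richer network, smaller forecast uncertainty
```

In this framing, "sampling upstream of the target area" corresponds to choosing an H that projects onto the control variables the target is most sensitive to.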

  14. A survey on methods of design features identification

    NASA Astrophysics Data System (ADS)

    Grabowik, C.; Kalinowski, K.; Paprocka, I.; Kempa, W.

    2015-11-01

    It is widely accepted that design features are one of the most effective means of integrating engineering activities such as design modelling, process planning and production scheduling. One of the most important tasks in integrating design and planning functions is design translation, i.e. the mapping of design data into the data relevant for process planning, namely manufacturing data. Translation of a design's geometrical shape can be realized with one of the following strategies: (i) designing with a previously prepared library of design features, known as the DBF method, i.e. design by feature; (ii) interactive feature recognition (IFR); (iii) automatic feature recognition (AFR). In the DBF method the geometrical shape is created from design features. There are two basic approaches to design modelling in DBF: the classic approach, in which a part is modelled from beginning to end with design features stored in a design features database, and the hybrid approach, in which a part is partially created with standard predefined CAD system tools and the rest with suitable design features. Automatic feature recognition consists of autonomously searching a product model, represented with a specific design representation method, for those model elements that might be recognized as design features, manufacturing features, etc. This approach requires a searching algorithm that can carry out the whole recognition process without user supervision. Many AFR methods currently exist; most require the product model to be represented in B-Rep form, more rarely CSG, and very rarely wireframe. In the IFR method potential features are recognized by a user, most often by pointing out the surfaces that appear to belong to the feature currently being identified. The system designer defines a set of features and a collection of recognition-process parameters, which allows individual features to be identified unambiguously, automatically or semi-automatically, either directly in the CAD system or in an external application to which the part model is transferred. Additionally, a user can attach non-geometrical information such as overall dimensions and surface roughness. In this paper a survey of feature identification and recognition methods is presented, with particular attention to AFR methods.
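    The rule-matching flavor of AFR can be sketched in a few lines (a toy stand-in for a B-Rep: the `Face` class, the "hole"/"slot" rules, and the part below are all invented for illustration and are far simpler than real AFR algorithms):

```python
from dataclasses import dataclass, field

# Toy rule-based automatic feature recognition (AFR) on a simplified B-Rep
# stand-in: each face carries a surface type plus the names of neighbors
# joined to it by concave edges.

@dataclass
class Face:
    name: str
    surface: str                      # "planar" or "cylindrical"
    concave_neighbors: list = field(default_factory=list)

def recognize_features(faces):
    """Pattern rules: a cylindrical face bounded by concave edges is
    reported as a hole; a planar face with exactly two concave neighbors
    is reported as a slot bottom."""
    features = []
    for f in faces:
        if f.surface == "cylindrical" and f.concave_neighbors:
            features.append(("hole", f.name))
        elif f.surface == "planar" and len(f.concave_neighbors) == 2:
            features.append(("slot", f.name))
    return features

part = [
    Face("top", "planar"),
    Face("bore_wall", "cylindrical", ["top"]),
    Face("slot_bottom", "planar", ["side_a", "side_b"]),
]
print(recognize_features(part))   # [('hole', 'bore_wall'), ('slot', 'slot_bottom')]
```

Real AFR systems apply the same idea — matching subgraphs of the face-adjacency graph against feature templates — but must cope with interacting features, which is why so many competing methods exist.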

  15. 50 CFR 600.1417 - Requirements for exempted state designation based on submission of recreational survey data.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... the state's participation in a qualifying regional survey, and the survey's sample design, data...) Meet NMFS survey design and data collection standards. ... designation based on submission of recreational survey data. 600.1417 Section 600.1417 Wildlife and...

  16. 50 CFR 600.1417 - Requirements for exempted state designation based on submission of recreational survey data.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... the state's participation in a qualifying regional survey, and the survey's sample design, data...) Meet NMFS survey design and data collection standards. ... designation based on submission of recreational survey data. 600.1417 Section 600.1417 Wildlife and...

  17. 50 CFR 600.1417 - Requirements for exempted state designation based on submission of recreational survey data.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... the state's participation in a qualifying regional survey, and the survey's sample design, data...) Meet NMFS survey design and data collection standards. ... designation based on submission of recreational survey data. 600.1417 Section 600.1417 Wildlife and...

  18. Quantitative Modeling of Selective Lysosomal Targeting for Drug Design

    PubMed Central

    Rosania, Gus R.; Horobin, Richard W.; Kornhuber, Johannes

    2009-01-01

    Lysosomes are acidic organelles involved in various diseases, the most prominent being malaria. Accumulation of molecules in the cell by diffusion from the external solution into the cytosol, lysosome and mitochondrion was calculated with the Fick-Nernst-Planck equation. The cell model considers the diffusion of neutral and ionic molecules across biomembranes, dissociation to mono- or bivalent ions, adsorption to lipids, and electrical attraction or repulsion. Based on simulation results, high and selective accumulation in lysosomes was found for weak mono- and bivalent bases with intermediate to high log Kow. These findings were validated against experimental results and by comparison to the properties of antimalarial drugs in clinical use. Of ten active compounds, nine were predicted to accumulate to a greater extent in lysosomes than in other organelles; six of these were in the optimum range predicted by the model and three were close. Five of the antimalarial drugs were lipophilic weak dibasic compounds. The predicted optimum properties for selective accumulation of weak bivalent bases in lysosomes are consistent with experimental values and are more accurate than any prior calculation. This demonstrates that the cell model can be a useful tool for the design of effective lysosome-targeting drugs with minimal off-target interactions. PMID:18504571
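    The core "ion trapping" mechanism behind lysosomal accumulation can be sketched with a much-reduced pH-partitioning model for a weak monobase (this is not the paper's full Fick-Nernst-Planck cell model; it assumes only the neutral species equilibrates across membranes, so the total-concentration ratio follows Henderson-Hasselbalch terms):

```python
# Simplified pH-partitioning ("ion trapping") estimate for a weak monobasic
# drug. Assumption: only the neutral form crosses membranes and reaches the
# same concentration on both sides; the protonated form is trapped.

def accumulation_ratio(pKa, pH_inside, pH_outside=7.4):
    """Steady-state total-concentration ratio inside/outside a compartment."""
    return (1 + 10 ** (pKa - pH_inside)) / (1 + 10 ** (pKa - pH_outside))

lyso = accumulation_ratio(8.5, 5.0)   # acidic lysosome, pH ~5
cyto = accumulation_ratio(8.5, 7.2)   # near-neutral cytosol
assert lyso > 100 * cyto              # strong, selective lysosomal enrichment
```

For a bivalent base the trapped fraction carries two Henderson-Hasselbalch terms, which is why the paper finds even higher selectivity for weak dibases.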

  19. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  20. SEDS: THE SPITZER EXTENDED DEEP SURVEY. SURVEY DESIGN, PHOTOMETRY, AND DEEP IRAC SOURCE COUNTS

    SciTech Connect

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Hernquist, L.; Hora, J. L.; Arendt, R.; Barmby, P.; Barro, G.; Faber, S.; Guhathakurta, P.; Bouwens, R.; Cattaneo, A.; Croton, D.; Dave, R.; Dunlop, J. S.; Egami, E.; Finlator, K.; Grogin, N. A.; and others

    2013-05-20

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 μm. Because of its uniform depth of coverage in so many widely-separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly-available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 μm to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  1. SEDS: The Spitzer Extended Deep Survey. Survey Design, Photometry, and Deep IRAC Source Counts

    NASA Technical Reports Server (NTRS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Arendt, A.; Barmby, P.; Barro, G.; Bell, E. F.; Bouwens, R.; Cattaneo, A.; Croton, D.; Dave, R.; Dunlop, J. S.; Egami, E.; Faber, S.; Finlator, K.; Grogin, N. A.; Guhathakurta, P.; Hernquist, L.; Hora, J. L.; Illingworth, G.; Kashlinsky, A.; Koekmoer, A. M.; Koo, D. C.; Moseley, H.

    2013-01-01

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 μm. Because of its uniform depth of coverage in so many widely-separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly-available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 μm to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  2. Design Considerations: Falcon M Dwarf Habitable Exoplanet Survey

    NASA Astrophysics Data System (ADS)

    Polsgrove, Daniel; Novotny, Steven; Della-Rose, Devin J.; Chun, Francis; Tippets, Roger; O'Shea, Patrick; Miller, Matthew

    2016-01-01

    The Falcon Telescope Network (FTN) is an assemblage of twelve automated 20-inch telescopes positioned around the globe, controlled from the Cadet Space Operations Center (CSOC) at the US Air Force Academy (USAFA) in Colorado Springs, Colorado. Five of the 12 sites are currently installed, with full operational capability expected by the end of 2016. Though optimized for studying near-earth objects to accomplish its primary mission of Space Situational Awareness (SSA), the Falcon telescopes are in many ways similar to those used by ongoing and planned exoplanet transit surveys targeting individual M dwarf stars (e.g., MEarth, APACHE, SPECULOOS). The network's worldwide geographic distribution provides additional potential advantages. We have performed analytical and empirical studies exploring the viability of employing the FTN for a future survey of nearby late-type M dwarfs tailored to detect transits of 1-2 R_Earth exoplanets in habitable-zone orbits. We present empirical results on photometric precision derived from data collected with multiple Falcon telescopes on a set of nearby (< 25 pc) M dwarfs using infrared filters and a range of exposure times, as well as sample light curves created from images gathered during known transits of varying transit depths. An investigation of survey design parameters is also described, including an analysis of site-specific weather data, anticipated telescope time allocation and the percentage of nearby M dwarfs with sufficient check stars within the Falcons' 11' x 11' field-of-view required to perform effective differential photometry. The results of this ongoing effort will inform the likelihood of discovering one (or more) habitable-zone exoplanets given current occurrence rate estimates over a nominal five-year campaign, and will dictate specific survey design features in preparation for initiating project execution when the FTN begins full-scale automated operations.
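    A back-of-envelope depth calculation (not drawn from the survey itself; the example radii are assumptions) shows why late M dwarfs are favorable targets for 20-inch-class telescopes: transit depth scales as (Rp/Rs)², so a small star makes a small planet's transit far deeper.

```python
# Transit depth = (Rp / Rs)**2, comparing an M dwarf host to a Sun-like host.
R_EARTH_IN_RSUN = 0.00917   # approximate Earth radius in solar radii

def transit_depth(rp_in_earth_radii, rs_in_solar_radii):
    return (rp_in_earth_radii * R_EARTH_IN_RSUN / rs_in_solar_radii) ** 2

depth_m_dwarf = transit_depth(1.5, 0.15)   # ~0.8%: reachable at few-mmag precision
depth_sunlike = transit_depth(1.5, 1.0)    # ~0.02%: beyond small ground telescopes
assert depth_m_dwarf > 30 * depth_sunlike
```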

  3. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level. PMID:26202064
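    The "probability of failure" framing can be sketched with a minimal Monte Carlo simulation. Everything below — the response model, parameter distributions, and specification limit — is invented for illustration and is not the paper's formulation: sample process parameters, propagate them through an assumed model, and estimate the probability that a critical quality attribute (CQA) misses its spec.

```python
import numpy as np

# Minimal Monte Carlo sketch of quantitative design-space risk for a tablet
# formulation (all numbers are hypothetical).
rng = np.random.default_rng(42)
n = 100_000
force = rng.normal(10.0, 0.5, n)        # hypothetical compression force, kN
lubricant = rng.normal(1.0, 0.1, n)     # hypothetical lubricant level, % w/w

# Assumed linear response for dissolution at 30 min (%), plus residual noise
dissolution = (95.0 - 1.2 * (force - 10.0)
               - 8.0 * (lubricant - 1.0)
               + rng.normal(0.0, 1.5, n))

p_fail = float(np.mean(dissolution < 90.0))   # spec: >= 90% released at 30 min
print(f"estimated probability of failure: {p_fail:.4f}")
```

Repeating this estimate over a grid of nominal set points is what turns a qualitative design space into a quantitative one: the boundary becomes the contour where p_fail exceeds an acceptable level.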

  4. Questionnaire survey of physicians: Design and practical use in nephrology

    PubMed Central

    Agrawal, Varun; Garimella, P. S.; Roshan, S. J.; Ghosh, A. K.

    2009-01-01

    As medicine grows in complexity, it is imperative for physicians to update their knowledge base and practice to reflect current standards of care. Postgraduate training offers a golden opportunity for resident physicians to create a strong foundation of concepts in medicine. There is a need for assessing the knowledge of residents regarding established clinical practice guidelines and their perceptions regarding patient care and management. In this paper, we review how questionnaire surveys can be designed and applied to identify significant gaps in resident knowledge and inappropriate attitudes and beliefs. This evaluation has important implications for program directors who can then initiate measures to improve resident education. Such efforts during residency training have the potential of improving patient outcomes. We discuss the design of the questionnaire, its pre-testing and validity measures, online distribution, efficient response collection, data analysis, and possible future research. Finally, we illustrate this method of educational research with a questionnaire survey designed to measure the awareness of chronic kidney disease among internal medicine residents. PMID:20368922

  5. 23 CFR 1340.10 - Submission and approval of seat belt survey design.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 23 Highways 1 2013-04-01 2013-04-01 false Submission and approval of seat belt survey design. 1340... § 1340.10 Submission and approval of seat belt survey design. (a) Contents: The following information shall be included in the State's seat belt survey design submitted for NHTSA approval: (1) Sample...

  6. 23 CFR 1340.10 - Submission and approval of seat belt survey design.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Submission and approval of seat belt survey design. 1340... § 1340.10 Submission and approval of seat belt survey design. (a) Contents: The following information shall be included in the State's seat belt survey design submitted for NHTSA approval: (1) Sample...

  7. 23 CFR 1340.10 - Submission and approval of seat belt survey design.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false Submission and approval of seat belt survey design. 1340... § 1340.10 Submission and approval of seat belt survey design. (a) Contents: The following information shall be included in the State's seat belt survey design submitted for NHTSA approval: (1) Sample...

  8. The Large Synoptic Survey Telescope preliminary design overview

    NASA Astrophysics Data System (ADS)

    Krabbendam, V. L.; Sweeney, D.

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) Project is a public-private partnership that is well into the design and development of the complete observatory system to conduct a wide fast deep survey and to process and serve the data. The telescope has a 3-mirror wide field optical system with an 8.4 meter primary, 3.4 meter secondary, and 5 meter tertiary mirror. The reflective optics feed three refractive elements and a 64 cm 3.2 gigapixel camera. The LSST data management system will reduce, transport, alert and archive the roughly 15 terabytes of data produced nightly, and will serve the raw and catalog data accumulating at an average of 7 petabytes per year to the community without any proprietary period. The project has completed several data challenges designed to prototype and test the data management system to significant pre-construction levels. The project continues to attract institutional partners and has acquired non-federal funding sufficient to construct the primary mirror, already in progress at the University of Arizona, build the secondary mirror substrate, completed by Corning, and fund detector prototype efforts, several that have been tested on the sky. A focus of the project is systems engineering, risk reduction through prototyping and major efforts in image simulation and operation simulations. The project has submitted a proposal for construction to the National Science Foundation Major Research Equipment and Facilities Construction (MREFC) program and has prepared project advocacy papers for the National Research Council's Astronomy 2010 Decadal Survey. The project is preparing for a 2012 construction funding authorization.

  9. Skin Microbiome Surveys Are Strongly Influenced by Experimental Design.

    PubMed

    Meisel, Jacquelyn S; Hannigan, Geoffrey D; Tyldsley, Amanda S; SanMiguel, Adam J; Hodkinson, Brendan P; Zheng, Qi; Grice, Elizabeth A

    2016-05-01

    Culture-independent studies to characterize skin microbiota are increasingly common, due in part to affordable and accessible sequencing and analysis platforms. Compared to culture-based techniques, DNA sequencing of the bacterial 16S ribosomal RNA (rRNA) gene or whole metagenome shotgun (WMS) sequencing provides more precise microbial community characterizations. Most widely used protocols were developed to characterize microbiota of other habitats (i.e., gastrointestinal) and have not been systematically compared for their utility in skin microbiome surveys. Here we establish a resource for the cutaneous research community to guide experimental design in characterizing skin microbiota. We compare two widely sequenced regions of the 16S rRNA gene to WMS sequencing for recapitulating skin microbiome community composition, diversity, and genetic functional enrichment. We show that WMS sequencing most accurately recapitulates microbial communities, but sequencing of hypervariable regions 1-3 of the 16S rRNA gene provides highly similar results. Sequencing of hypervariable region 4 poorly captures skin commensal microbiota, especially Propionibacterium. WMS sequencing, which is resource and cost intensive, provides evidence of a community's functional potential; however, metagenome predictions based on 16S rRNA sequence tags closely approximate WMS genetic functional profiles. This study highlights the importance of experimental design for downstream results in skin microbiome surveys. PMID:26829039

  10. Quantitative Feedback Theory (QFT) applied to the design of a rotorcraft flight control system

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Gorder, P. J.

    1992-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. Quantitative Feedback Theory is applied to the design of the longitudinal flight control system for a linear uncertain model of the AH-64 rotorcraft. In this model, the uncertainty is assigned, and is assumed to be attributable to actual uncertainty in the dynamic model and to the changes in the vehicle aerodynamic characteristics which occur near hover. The model includes an approximation to the rotor and actuator dynamics. The design example indicates the manner in which handling qualities criteria may be incorporated into the design of realistic rotorcraft control systems in which significant uncertainty exists in the vehicle model.
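    The central QFT check — one fixed compensator must satisfy frequency-domain bounds for every plant in the uncertainty set — can be illustrated generically (the first-order plant family and PI-type compensator below are invented, not the AH-64 longitudinal design):

```python
import numpy as np

# Generic flavor of a QFT-style robustness check: verify that a single fixed
# controller keeps the sensitivity |1/(1+L)| below a bound for every plant
# in an assumed uncertainty set, over a frequency grid.

def sensitivity(K, P, w):
    L = K(1j * w) * P(1j * w)          # loop transfer function at s = jw
    return abs(1.0 / (1.0 + L))

# Uncertain plants P(s) = a / (s + a) with a in [1, 4] (assumed family)
plants = [lambda s, a=a: a / (s + a) for a in np.linspace(1.0, 4.0, 7)]
controller = lambda s: 20.0 * (s + 1.0) / s    # PI-type compensator

freqs = np.logspace(-2, 1, 50)
worst = max(sensitivity(controller, P, w) for P in plants for w in freqs)
assert worst < 1.3   # sensitivity peak bounded across the whole plant set
```

In full QFT the bounds are derived from handling-qualities templates on the Nichols chart; the loop-over-plants structure is the same.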

  11. Image resolution analysis: A new, robust approach to seismic survey design

    NASA Astrophysics Data System (ADS)

    Tzimeas, Constantinos

    Seismic survey design methods often rely on qualitative measures to provide an optimal image of their objective target. Fold, ray tracing techniques counting ray hits on binned interfaces, and even advanced 3-D survey design methods that try to optimize offset and azimuth coverage are prone to fail (especially in complex geological or structural settings) in their imaging predictions. The reason for the potential failure of these commonly used approaches derives from the fact that they do not take into account the ray geometry at the target points. Inverse theory results can provide quantitative and objective constraints on acquisition design. Beylkin's contribution to this field is an elegant and simple equation describing a reconstructed point scatterer given the source/receiver distribution used in the imaging experiment. Quantitative measures of spatial image resolution were developed to assess the efficacy of competing acquisition geometries. Apart from the source/receiver configuration, parameters such as the structure and seismic velocity also influence image resolution. Understanding their effect on image quality allows us to better interpret the resolution results for the surveys under examination. A salt model was used to simulate imaging of target points located underneath and near the flanks of the diapir. Three different survey designs were examined. Results from these simulations show that contrary to simple models, near-offsets do not always produce better resolved images than far-offsets. However, consideration of decreasing signal-to-noise ratio revealed that images obtained from the far-offset experiment are degrading faster than the near-offset ones. The image analysis was performed on VSP field data as well as synthetics generated by finite difference forward modeling. The predicted image resolution results were compared to measured resolution from the migrated sections of both the field data and the synthetics.
This comparison confirms that image resolution analysis provides as good a resolution prediction as the prestack Kirchhoff depth migrated section of the synthetic gathers. Even in the case of the migrated field data, despite the presence of error-introducing factors (different signal-to-noise ratios, shape and frequency content of source wavelets, etc.), image resolution analysis performed well, exhibiting the same trends of resolution changes at different test points.

  12. [Software and hardware design for the temperature control system of quantitative polymerase chain reaction].

    PubMed

    Qiu, Xian-bo; Yuan, Jing-qi; Li, Qi

    2005-07-01

    A temperature control system for quantitative polymerase chain reaction (PCR) is presented in the paper with both software and hardware configuration. The performance of the control system has been improved by optimizing the software and hardware design according to the system's properties. The control system has been proven to have good repeatability and reliability as well as high control precision. PMID:16268349
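    Thermal-cycler temperature loops of this kind are typically PID-based. The sketch below is a generic illustration with assumed plant constants, not the paper's actual hardware or software design: a first-order thermal model of the block tracks the 95 °C denaturation setpoint under a saturated heater with simple anti-windup.

```python
# Illustrative PID temperature loop for a thermal-cycler block (all gains
# and the thermal time constant are assumptions for the sketch).

def simulate(setpoint=95.0, t_amb=25.0, dt=0.1, steps=3000,
             kp=8.0, ki=0.8, kd=1.0, tau=20.0):
    """Return block temperature after `steps` PID updates of size dt."""
    temp, integral = t_amb, 0.0
    prev_err = setpoint - temp
    for _ in range(steps):
        err = setpoint - temp
        deriv = (err - prev_err) / dt
        p = kp * err + ki * integral + kd * deriv
        power = max(0.0, min(100.0, p))     # heater output limited to 0..100%
        if 0.0 < p < 100.0:                 # anti-windup: hold integral when saturated
            integral += err * dt
        # first-order thermal block: heating power vs. loss to ambient
        temp += dt * (power - (temp - t_amb)) / tau
        prev_err = err
    return temp

final_temp = simulate()
assert abs(final_temp - 95.0) < 0.5   # settles close to the setpoint
```

Ramp rate, overshoot and hold accuracy — the quantities behind the paper's "repeatability and control precision" claims — all fall out of tuning kp, ki, kd against the block's thermal constants.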

  13. Quantitative Survey and Structural Classification of Hydraulic Fracturing Chemicals Reported in Unconventional Gas Production.

    PubMed

    Elsner, Martin; Hoelzer, Kathrin

    2016-04-01

    Much interest is directed at the chemical structure of hydraulic fracturing (HF) additives in unconventional gas exploitation. To bridge the gap between existing alphabetical disclosures by function/CAS number and emerging scientific contributions on fate and toxicity, we review the structural properties which motivate HF applications, and which determine environmental fate and toxicity. Our quantitative overview relied on voluntary U.S. disclosures evaluated from the FracFocus registry by different sources and on a House of Representatives ("Waxman") list. Out of over 1000 reported substances, classification by chemistry yielded succinct subsets able to illustrate the rationale of their use, and physicochemical properties relevant for environmental fate, toxicity and chemical analysis. While many substances were nontoxic, frequent disclosures also included notorious groundwater contaminants like petroleum hydrocarbons (solvents), precursors of endocrine disruptors like nonylphenols (nonemulsifiers), toxic propargyl alcohol (corrosion inhibitor), tetramethylammonium (clay stabilizer), biocides or strong oxidants. Application of highly oxidizing chemicals, together with occasional disclosures of putative delayed acids and complexing agents (i.e., compounds designed to react in the subsurface) suggests that relevant transformation products may be formed. To adequately investigate such reactions, available information is not sufficient, but instead a full disclosure of HF additives is necessary. PMID:26902161

  14. The Brightest of Reionizing Galaxies Survey: Design and Key Results

    NASA Astrophysics Data System (ADS)

    Stiavelli, Massimo; Trenti, M.; Collective, BoRG

    2012-01-01

    The Brightest of Reionizing Galaxies survey (BoRG) is a large Hubble Space Telescope program aimed at searching for very luminous (M_AB ~ -21) galaxies at z ~ 8 from wide-area, medium-deep observations in four filters (V, J, Y, H) reaching a median sensitivity of m_AB ~ 27 at 5σ over 250 arcmin². The pure-parallel nature of BoRG allows us to obtain a census of the population of rare galaxies within the epoch of reionization (700 million years after the Big Bang) without being affected by cosmic variance. The observations carried out to date led to the identification of 7 very bright candidates, two of which have been followed up with spectroscopic observations at Keck. BoRG has been extended for the next cycle of HST observations and will cover an additional area of about 100 arcmin² with longer exposure time, constructing a sample of observations at different depths to optimally probe the bright end of the galaxy luminosity function at z ~ 8. In this poster we present the survey design, its key results to date and future prospects.

  15. Induced Polarization Surveying for Acid Rock Screening in Highway Design

    NASA Astrophysics Data System (ADS)

    Butler, K. E.; Al, T.; Bishop, T.

    2004-05-01

    Highway and pipeline construction agencies have become increasingly vigilant in their efforts to avoid cutting through sulphide-bearing bedrock that has potential to produce acid rock drainage. Blasting and fragmentation of such rock increases the surface area available for sulphide oxidation and hence increases the risk of acid rock drainage unless the rock contains enough natural buffering capacity to neutralize the pH. In December 2001, the New Brunswick Department of Transportation (NBDOT) sponsored a field trial of geophysical surveying in order to assess its suitability as a screening tool for locating near-surface sulphides along proposed highway alignments. The goal was to develop a protocol that would allow existing programs of drilling and geochemical testing to be targeted more effectively, and provide design engineers with the information needed to reduce rock cuts where necessary and dispose of blasted material in a responsible fashion. Induced polarization (IP) was chosen as the primary geophysical method given its ability to detect low-grade disseminated mineralization. The survey was conducted in dipole-dipole mode using an exploration-style time domain IP system, dipoles 8 to 25 m in length, and six potential dipoles for each current dipole location (i.e. n = 1 - 6). Supplementary information was provided by resistivity and VLF-EM surveys sensitive to lateral changes in electrical conductivity, and by magnetic field surveying chosen for its sensitivity to the magnetic susceptibility of pyrrhotite. Geological and geochemical analyses of samples taken from several IP anomalies located along 4.3 line-km of proposed highway confirmed the effectiveness of the screening technique. IP pseudosections from a region of metamorphosed shales and volcaniclastic rocks identified discrete, well-defined mineralized zones.
Stronger, overlapping, and more laterally extensive IP anomalies were observed over a section of graphitic and sulphide-bearing metasedimentary rocks. Attempts to use spectral IP characteristics to determine relative abundances of sulphides and graphite were not conclusive. The overall effectiveness of the screening technique however encouraged NBDOT to apply it to an additional 50 km of planned rock cuts along the corridor selected for the new Trans-Canada Highway.

  16. Discrimination, Personality, and Achievement: A Survey of Northern Blacks. Quantitative Studies in Social Relations Series.

    ERIC Educational Resources Information Center

    Crain, Robert L.; Weisman, Carol Sachs

    In the Spring of 1966, the Civil Rights Commission asked the National Opinion Research Center (NORC) to conduct a survey of Northern blacks to determine the effects, if any, of attending integrated versus segregated schools. The result was an extensive survey of 1651 black men and women, aged 21 to 45, living in the metropolitan areas of the…

  17. Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design.

    PubMed

    Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M

    2016-01-01

    Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared - non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents. PMID:27147293

  18. Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design

    PubMed Central

    Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M.

    2016-01-01

    Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared – non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents. PMID:27147293

  19. Study of Nurses’ Knowledge about Palliative Care: A Quantitative Cross-sectional Survey

    PubMed Central

    Prem, Venkatesan; Karvannan, Harikesavan; Kumar, Senthil P; Karthikbabu, Surulirajan; Syed, Nafeez; Sisodia, Vaishali; Jaykumar, Saroja

    2012-01-01

    Context: Studies have documented that nurses and other health care professionals are inadequately prepared to care for patients in palliative care. Several reasons have been identified including inadequacies in nursing education, absence of curriculum content related to pain management, and knowledge related to pain and palliative care. Aims: The objective of this paper was to assess the knowledge about palliative care amongst nursing professionals using the palliative care knowledge test (PCKT). Settings and Design: Cross-sectional survey of 363 nurses in a multispecialty hospital. Materials and Methods: The study utilized a self-report questionnaire- PCKT developed by Nakazawa et al., which had 20 items (statements about palliative care) for each of which the person had to indicate ‘correct’, ‘incorrect’, or ‘unsure.’ The PCKT had 5 subscales (philosophy- 2 items, pain- 6 items, dyspnea- 4 items, psychiatric problems- 4 items, and gastro-intestinal problems- 4 items). Statistical Analysis Used: Comparison across individual and professional variables for both dimensions were done using one-way ANOVA, and correlations were done using Karl Pearson's coefficient in SPSS version 16.0 for Windows. Results: The overall total score of PCKT was 7.16 ± 2.69 (35.8%). The philosophy score was .73 ± .65 (36.5%), pain score was 2.09 ± 1.19 (34.83%), dyspnea score was 1.13 ± .95 (28.25%), psychiatric problems score was 1.83 ± 1.02 (45.75%), and gastro-intestinal problems score was 1.36 ± .97 (34%) (P = .00). The female nurses scored higher than their male counterparts, but the difference was not significant (P > .05). Conclusions: Overall level of knowledge about palliative care was poor, and nurses had a greater knowledge about psychiatric problems and philosophy than the other aspects indicated in PCKT. PMID:23093828
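
    The PCKT scoring scheme described above (20 keyed statements across five subscales) can be sketched as follows; the answer key and item labels here are invented placeholders, not Nakazawa et al.'s actual key:

```python
# Subscale layout from the abstract; the answer key used below is
# invented for illustration -- the real key is defined by Nakazawa et al.
SUBSCALES = {"philosophy": 2, "pain": 6, "dyspnea": 4,
             "psychiatric": 4, "gastrointestinal": 4}

def score_pckt(responses, key):
    """Score one respondent: an item earns 1 point when the judgement
    ('correct'/'incorrect') matches the key; 'unsure' never scores.
    Returns (per-subscale raw scores, total percentage)."""
    raw = {s: 0 for s in SUBSCALES}
    for item, keyed in key.items():
        subscale, _ = item
        if responses.get(item) == keyed:
            raw[subscale] += 1
    total = sum(raw.values())
    return raw, 100.0 * total / len(key)
```

    Subscale percentages such as those reported (e.g. 36.5% for the 2-item philosophy subscale) follow by dividing each raw subscale score by its item count.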

  20. Pragmatic soil survey design using flexible Latin hypercube sampling

    NASA Astrophysics Data System (ADS)

    Clifford, David; Payne, James E.; Pringle, M. J.; Searle, Ross; Butler, Nathan

    2014-06-01

    We review and give a practical example of Latin hypercube sampling in soil science using an approach we call flexible Latin hypercube sampling. Recent studies of soil properties in large and remote regions have highlighted problems with the conventional Latin hypercube sampling approach. It is often impractical to travel far from tracks and roads to collect samples, and survey planning should recognise this fact. Another problem is how to handle target sites that, for whatever reason, are impractical to sample - should one just move on to the next target or choose something in the locality that is accessible? Working within a Latin hypercube that spans the covariate space, selecting an alternative site is hard to do optimally. We propose flexible Latin hypercube sampling as a means of avoiding these problems. Flexible Latin hypercube sampling involves simulated annealing for optimally selecting accessible sites from a region. The sampling protocol also produces an ordered list of alternative sites close to the primary target site, should the primary target site prove inaccessible. We highlight the use of this design through a broad-scale sampling exercise in the Burdekin catchment of north Queensland, Australia. We highlight the robustness of our design through a simulation study where up to 50% of target sites may be inaccessible.
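
    The flexible Latin hypercube idea above (optimally selecting accessible sites by simulated annealing) can be sketched as a toy. This is an illustrative sketch, not the authors' algorithm: it optimizes only marginal stratum occupancy, and omits the access costs and the ordered alternate-site lists of the real protocol:

```python
import math
import random

def flexible_lhs(candidates, k, iters=3000, t0=1.0, cooling=0.998, seed=1):
    """Toy flexible Latin hypercube sampler (illustrative only).

    candidates -- covariate vectors of the *accessible* sites
    k          -- number of sites to select

    Sites are swapped in and out by simulated annealing so that, for each
    covariate, each of the k equal-probability strata ends up holding
    about one selected site.
    """
    rng = random.Random(seed)
    n, d = len(candidates), len(candidates[0])

    # stratum boundaries: k-quantiles of each covariate over all candidates
    bounds = []
    for j in range(d):
        vals = sorted(c[j] for c in candidates)
        bounds.append([vals[i * (n - 1) // k] for i in range(1, k)])

    def stratum(x, j):
        return sum(x >= b for b in bounds[j])

    def cost(sel):
        # deviation from one selected site per stratum, summed over covariates
        total = 0
        for j in range(d):
            counts = [0] * k
            for i in sel:
                counts[stratum(candidates[i][j], j)] += 1
            total += sum(abs(c - 1) for c in counts)
        return total

    sel = set(rng.sample(range(n), k))
    cur = cost(sel)
    best, best_cost, t = set(sel), cur, t0
    for _ in range(iters):
        out = rng.choice(sorted(sel))
        swap_in = rng.choice([i for i in range(n) if i not in sel])
        trial = (sel - {out}) | {swap_in}
        trial_cost = cost(trial)
        # Metropolis acceptance with geometric cooling
        if trial_cost <= cur or rng.random() < math.exp((cur - trial_cost) / t):
            sel, cur = trial, trial_cost
            if cur < best_cost:
                best, best_cost = set(sel), cur
        t *= cooling
    return sorted(best), best_cost
```

    In a production design, the objective would also penalize distance from tracks and roads, and each primary site would carry an ordered list of nearby alternates in covariate space.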

  1. Current State of Agile User-Centered Design: A Survey

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    Agile software development methods are quite popular nowadays and are being adopted at an increasing rate in the industry every year. However, these methods are still lacking usability awareness in their development lifecycle, and the integration of usability/User-Centered Design (UCD) into agile methods is not adequately addressed. This paper presents the preliminary results of a recently conducted online survey regarding the current state of the integration of agile methods and usability/UCD. A worldwide response from 92 practitioners was received. The results show that the majority of practitioners perceive that the integration of agile methods with usability/UCD has added value to their adopted processes and to their teams; has resulted in the improvement of usability and quality of the product developed; and has increased the satisfaction of the end-users of the product developed. The most frequently used HCI techniques are low-fidelity prototyping, conceptual designs, observational studies of users, usability expert evaluations, field studies, personas, rapid iterative testing, and laboratory usability testing.

  2. Quantitative Survey and Structural Classification of Fracking Chemicals Reported in Unconventional Gas Exploitation

    NASA Astrophysics Data System (ADS)

    Elsner, Martin; Schreglmann, Kathrin

    2015-04-01

    Few technologies are being discussed in such controversial terms as hydraulic fracturing ("fracking") in the recovery of unconventional gas. Of particular concern are the chemicals that may return to the surface as a result of hydraulic fracturing. These are either "fracking chemicals" - chemicals that are injected together with the fracking fluid to optimize the fracturing performance - or geogenic substances which may turn up during gas production, in the so-called produced water originating from the target formation. Knowledge about them is warranted for several reasons. (1) Monitoring. Air emissions are reported to arise from well drilling, the gas itself or condensate tanks. In addition, potential spills and accidents bear the danger of surface and shallow groundwater contaminations. Monitoring strategies are therefore warranted to screen for "indicator" substances of potential impacts. (2) Chemical Analysis. To meet these analytical demands, target substances must be defined so that adequate sampling approaches and analytical methods can be developed. (3) Transformation in the Subsurface. Identification and classification of fracking chemicals (aromatics vs. alcohols vs. acids, esters, etc.) is further important to assess the possibility of subsurface reactions which may potentially generate new, as yet unidentified transformation products. (4) Wastewater Treatment. For the same reason chemical knowledge is important for optimized wastewater treatment strategies. (5) Human and Ecosystem Health. Knowledge of the most frequent fracking chemicals is further essential for risk assessment (environmental behavior, toxicity). (6) Public Discussions. Finally, an overview of reported fracking chemicals can provide unbiased scientific input into current public debates and enable critical reviews of Green Chemistry approaches. Presently, however, such information is not readily available.
We aim to close this knowledge gap by providing a quantitative overview of chemical additives reported for use in hydraulic fracturing. For the years 2005-2009 it is based on the Waxman report, and for the years 2011-2013 it relies on the database FracFocus, where it makes use of the data extracted and provided by the website "SkyTruth". For the first time, we list fracking chemicals according to their chemical structure and functional groups, because these properties are important as a starting point for (i) the design of analytical methods, (ii) to assess environmental fate and (iii) to understand why a given chemical is used at a certain stage of the fracturing process and what possible alternatives exist.

  3. Curriculum Design of Computer Graphics Programs: A Survey of Art/Design Programs at the University Level.

    ERIC Educational Resources Information Center

    McKee, Richard Lee

    This master's thesis reports the results of a survey submitted to over 30 colleges and universities that currently offer computer graphics courses or are in the planning stage of curriculum design. Intended to provide a profile of the computer graphics programs and insight into the process of curriculum design, the survey gathered data on program…

  4. Retrospective pilot feedback survey of 200 users of the AIDA Version 4 Educational Diabetes Program. 1--Quantitative Survey Data.

    PubMed

    Lehmann, Eldon D; Chatu, Sukhdev S; Hashmy, S Sabina H

    2006-06-01

    This column reports a detailed, questionnaire-based, post-release feedback survey of 200 users of the AIDA version 4 educational diabetes simulator. AIDA is a freeware computer program that permits the interactive simulation of plasma insulin and blood glucose profiles for educational, demonstration, self-learning, and research purposes. Since its Internet launch in 1996 over 700,000 visits have been logged to the AIDA Websites (including www.2aida.org) and over 200,000 program copies have been downloaded free-of-charge. The main goals of the current study were: (1) to establish what people have thought about the AIDA program, (2) to assess the utility of the software, and (3) to ascertain how much people have actually used it. An analysis was therefore undertaken of the first 200 feedback forms that were returned by AIDA users. The questionnaire-based survey methodology was found to be robust and reliable. Feedback forms were received from participants in 21 countries. One hundred six of 209 responses (50.7%) were received from people with diabetes, and 36 of 209 (17.2%) from relatives of patients, with lesser numbers from doctors, students, diabetes educators, nurses, pharmacists, and other end users. Please note that some respondents fulfilled more than one end-user category (for example, someone with diabetes who was also a doctor), hence the responses total more than 200. This study has established the feasibility of using a simple feedback form to survey a substantial number of diabetes software users. In addition, it has yielded interesting data in terms of who are the main users of the AIDA program, and has also provided technical (computer) information that has aided the release of a freeware upgrade to the software. In general, users reported finding the program to be of educational value. The majority also felt it would be of interest to diabetes educators and people with diabetes. Most were clear about its limitations as a simulator-based learning tool.
The implications of these findings will be discussed. PMID:16800766

  5. Patchy distribution fields: an interleaved survey design and reconstruction adequacy.

    PubMed

    Kalikhman, I

    2005-11-01

    A mathematical model was used to compare the effects of a regular (one-pass) or interleaved (two-pass) acoustic survey on the adequacy of reconstructing patchy distribution fields. The model simulates fish or plankton patches of different shapes and spatial orientations, and a set of parallel or zigzag transects forming a regular or interleaved acoustic survey. The efficiency of a survey is determined by the adequacy of a reconstructed field to that originally generated, which is evaluated by calculating their correlations. For immovable fields, the efficiency of a regular or interleaved acoustic survey was tested with the following two alternative assumptions: (1) the entire survey was completed; (2) the survey was interrupted for some reason at the moment when one transect remained uncompleted. In the former case, the efficiencies of both acoustic surveys were nearly the same; in the latter case, the efficiency of an interleaved survey was superior to that of a regular one. With respect to movable fields, the efficiency of the completed interleaved surveys was even higher than that of the regular ones. Thus, the results obtained allow us to conclude that an interleaved survey is expedient in cases where there is no preference regarding the position of a vessel for further work. PMID:16308787
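
    The model's efficiency measure, the correlation between the generated and reconstructed fields, can be illustrated with a 1-D toy analogue. The actual model is two-dimensional and supports moving fields; the field, transect layout, and nearest-transect reconstruction below are assumptions for illustration only:

```python
import math

def correlation(a, b):
    """Pearson correlation between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def reconstruct(transects, field):
    """Assign each point the value at the nearest surveyed transect."""
    return [field[min(transects, key=lambda t: abs(t - x))]
            for x in range(len(field))]

# a 1-D stand-in for a patchy distribution field: two Gaussian patches
field = [math.exp(-((x - 30) / 8.0) ** 2)
         + 0.7 * math.exp(-((x - 70) / 6.0) ** 2)
         for x in range(100)]

regular     = [0, 10, 20, 30, 40, 50, 60, 70, 80, 90]   # one pass, in order
interleaved = [0, 20, 40, 60, 80, 10, 30, 50, 70, 90]   # two passes

# completed survey vs. a survey interrupted two transects from the end
r_full = correlation(field, reconstruct(regular, field))
r_reg = correlation(field, reconstruct(regular[:-2], field))
r_int = correlation(field, reconstruct(interleaved[:-2], field))
```

    The interleaved plan degrades more gracefully under interruption because its remaining transects still span the whole field, while the regular plan leaves a contiguous unsurveyed strip.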

  6. Textile Materials for the Design of Wearable Antennas: A Survey

    PubMed Central

    Salvado, Rita; Loss, Caroline; Gonçalves, Ricardo; Pinho, Pedro

    2012-01-01

    In the broad context of Wireless Body Sensor Networks for healthcare and pervasive applications, the design of wearable antennas offers the possibility of ubiquitous monitoring, communication and energy harvesting and storage. Specific requirements for wearable antennas are a planar structure and flexible construction materials. Several properties of the materials influence the behaviour of the antenna. For instance, the bandwidth and the efficiency of a planar microstrip antenna are mainly determined by the permittivity and the thickness of the substrate. The use of textiles in wearable antennas requires the characterization of their properties. Specific electrical conductive textiles are available on the market and have been successfully used. Ordinary textile fabrics have been used as substrates. However, little information can be found on the electromagnetic properties of regular textiles. Therefore this paper is mainly focused on the analysis of the dielectric properties of normal fabrics. In general, textiles present a very low dielectric constant that reduces the surface wave losses and increases the impedance bandwidth of the antenna. However, textile materials are constantly exchanging water molecules with the surroundings, which affects their electromagnetic properties. In addition, textile fabrics are porous, anisotropic and compressible materials whose thickness and density might change with low pressures. Therefore it is important to know how these characteristics influence the behaviour of the antenna in order to minimize unwanted effects. This paper presents a survey of the key points for the design and development of textile antennas, from the choice of the textile materials to the framing of the antenna. An analysis of the textile materials that have been used is also presented. PMID:23202235
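
    The stated dependence of patch behaviour on substrate permittivity and thickness can be made concrete with the standard transmission-line design equations for a rectangular microstrip patch. The 2.45 GHz frequency and the substrate values below are assumed for illustration and are not taken from the paper:

```python
import math

C = 299792458.0  # speed of light, m/s

def patch_dimensions(f, eps_r, h):
    """Rectangular microstrip patch design at resonant frequency f (Hz)
    on a substrate of relative permittivity eps_r and thickness h (m).
    Returns (width W, effective permittivity, physical length L)."""
    W = C / (2.0 * f) * math.sqrt(2.0 / (eps_r + 1.0))
    eps_eff = (eps_r + 1.0) / 2.0 \
        + (eps_r - 1.0) / 2.0 * (1.0 + 12.0 * h / W) ** -0.5
    # fringing-field length extension (Hammerstad approximation)
    dL = 0.412 * h * ((eps_eff + 0.3) * (W / h + 0.264)) \
        / ((eps_eff - 0.258) * (W / h + 0.8))
    L = C / (2.0 * f * math.sqrt(eps_eff)) - 2.0 * dL
    return W, eps_eff, L

# e.g. a 2.45 GHz patch on an assumed 3 mm textile with eps_r ~ 1.7
W, eps_eff, L = patch_dimensions(2.45e9, 1.7, 3e-3)
```

    Lowering eps_r pushes eps_eff toward 1, which lengthens the patch and, as the abstract notes, widens the impedance bandwidth and reduces surface-wave losses.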

  7. Textile materials for the design of wearable antennas: a survey.

    PubMed

    Salvado, Rita; Loss, Caroline; Gonçalves, Ricardo; Pinho, Pedro

    2012-01-01

    In the broad context of Wireless Body Sensor Networks for healthcare and pervasive applications, the design of wearable antennas offers the possibility of ubiquitous monitoring, communication and energy harvesting and storage. Specific requirements for wearable antennas are a planar structure and flexible construction materials. Several properties of the materials influence the behaviour of the antenna. For instance, the bandwidth and the efficiency of a planar microstrip antenna are mainly determined by the permittivity and the thickness of the substrate. The use of textiles in wearable antennas requires the characterization of their properties. Specific electrical conductive textiles are available on the market and have been successfully used. Ordinary textile fabrics have been used as substrates. However, little information can be found on the electromagnetic properties of regular textiles. Therefore this paper is mainly focused on the analysis of the dielectric properties of normal fabrics. In general, textiles present a very low dielectric constant that reduces the surface wave losses and increases the impedance bandwidth of the antenna. However, textile materials are constantly exchanging water molecules with the surroundings, which affects their electromagnetic properties. In addition, textile fabrics are porous, anisotropic and compressible materials whose thickness and density might change with low pressures. Therefore it is important to know how these characteristics influence the behaviour of the antenna in order to minimize unwanted effects. This paper presents a survey of the key points for the design and development of textile antennas, from the choice of the textile materials to the framing of the antenna. An analysis of the textile materials that have been used is also presented. PMID:23202235

  8. Trajectory Design for the Transiting Exoplanet Survey Satellite

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald J.; Parker, Joel J. K.; Williams, Trevor W.; Mendelsohn, Chad R.

    2014-01-01

    The Transiting Exoplanet Survey Satellite (TESS) is a National Aeronautics and Space Administration (NASA) mission, scheduled to be launched in 2017. TESS will travel in a highly eccentric orbit around Earth, with initial perigee radius near 17 Earth radii (Re) and apogee radius near 59 Re. The orbit period is near 2:1 resonance with the Moon, with apogee nearly 90 degrees out-of-phase with the Moon, in a configuration that has been shown to be operationally stable. TESS will execute phasing loops followed by a lunar flyby, with a final maneuver to achieve 2:1 resonance with the Moon. The goals of a resonant orbit with long-term stability, short eclipses and limited oscillations of perigee present significant challenges to the trajectory design. To rapidly assess launch opportunities, we adapted the Schematics Window Methodology (SWM76) launch window analysis tool to assess the TESS mission constraints. To understand the long-term dynamics of such a resonant orbit in the Earth-Moon system we employed Dynamical Systems Theory in the Circular Restricted 3-Body Problem (CR3BP). For precise trajectory analysis we use a high-fidelity model and multiple shooting in the General Mission Analysis Tool (GMAT) to optimize the maneuver delta-V and meet mission constraints. Finally we describe how the techniques we have developed can be applied to missions with similar requirements. Keywords: resonant orbit, stability, lunar flyby, phasing loops, trajectory optimization
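
    The quoted geometry is self-consistent and easy to check with two-body mechanics: a 2:1 lunar resonance fixes the orbit period, hence (by Kepler's third law) the semi-major axis, which should match the average of the quoted perigee and apogee radii. A sketch ignoring lunar and solar perturbations:

```python
import math

MU_EARTH = 398600.4418            # Earth gravitational parameter, km^3/s^2
R_EARTH = 6378.137                # km
MOON_SIDEREAL_PERIOD_S = 27.321661 * 86400.0

def semimajor_axis(period_s, mu=MU_EARTH):
    """Kepler's third law solved for the semi-major axis:
    a = (mu * (P / (2*pi))**2) ** (1/3)."""
    return (mu * (period_s / (2.0 * math.pi)) ** 2) ** (1.0 / 3.0)

# a 2:1 lunar resonance means an orbit period of half the Moon's
a_km = semimajor_axis(MOON_SIDEREAL_PERIOD_S / 2.0)
a_re = a_km / R_EARTH
```

    The resonant semi-major axis comes out near 37.9 Re, consistent with the quoted (17 + 59)/2 = 38 Re.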

  9. Trajectory Design for the Transiting Exoplanet Survey Satellite (TESS)

    NASA Technical Reports Server (NTRS)

    Dichmann, Donald J.; Parker, Joel; Williams, Trevor; Mendelsohn, Chad

    2014-01-01

    The Transiting Exoplanet Survey Satellite (TESS) is a National Aeronautics and Space Administration (NASA) mission launching in 2017. TESS will travel in a highly eccentric orbit around Earth, with initial perigee radius near 17 Earth radii (Re) and apogee radius near 59 Re. The orbit period is near 2:1 resonance with the Moon, with apogee nearly 90 degrees out-of-phase with the Moon, in a configuration that has been shown to be operationally stable. TESS will execute phasing loops followed by a lunar flyby, with a final maneuver to achieve 2:1 resonance with the Moon. The goals of a resonant orbit with long-term stability, short eclipses and limited oscillations of perigee present significant challenges to the trajectory design. To rapidly assess launch opportunities, we adapted the SWM76 launch window tool to assess the TESS mission constraints. To understand the long-term dynamics of such a resonant orbit in the Earth-Moon system we employed Dynamical Systems Theory in the Circular Restricted 3-Body Problem (CR3BP). For precise trajectory analysis we use a high-fidelity model and multiple shooting in the General Mission Analysis Tool (GMAT) to optimize the maneuver delta-V and meet mission constraints. Finally we describe how the techniques we have developed can be applied to missions with similar requirements.

  10. A robust rotorcraft flight control system design methodology utilizing quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Gorder, Peter James

    1993-01-01

    Rotorcraft flight control systems present design challenges which often exceed those associated with fixed-wing aircraft. First, large variations in the response characteristics of the rotorcraft result from the wide range of airspeeds of typical operation (hover to over 100 kts). Second, the assumption of vehicle rigidity often employed in the design of fixed-wing flight control systems is rarely justified in rotorcraft where rotor degrees of freedom can have a significant impact on the system performance and stability. This research was intended to develop a methodology for the design of robust rotorcraft flight control systems. Quantitative Feedback Theory (QFT) was chosen as the basis for the investigation. Quantitative Feedback Theory is a technique which accounts for variability in the dynamic response of the controlled element in the design of robust control systems. It was developed to address a Multiple-Input Single-Output (MISO) design problem, and utilizes two degrees of freedom to satisfy the design criteria. Two techniques were examined for extending the QFT MISO technique to the design of a Multiple-Input-Multiple-Output (MIMO) flight control system (FCS) for a UH-60 Black Hawk Helicopter. In the first, a set of MISO systems, mathematically equivalent to the MIMO system, was determined. QFT was applied to each member of the set simultaneously. In the second, the same set of equivalent MISO systems were analyzed sequentially, with closed loop response information from each loop utilized in subsequent MISO designs. The results of each technique were compared, and the advantages of the second, termed Sequential Loop Closure, were clearly evident.

  11. National Aquatic Resource Surveys: Integration of Geospatial Data in Their Survey Design and Analysis

    EPA Science Inventory

    The National Aquatic Resource Surveys (NARS) are a series of four statistical surveys conducted by the U.S. Environmental Protection Agency working in collaboration with states, tribal nations and other federal agencies. The surveys are conducted for lakes and reservoirs, streams...

  12. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements on a software system for quantitative analysis of radiotherapy. Furthermore, we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. PMID:23523366
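
    The "dose iterator pattern" mentioned above decouples analysis algorithms from how a dose grid is stored. RTToolbox itself is a C++ library; the following is a hypothetical Python illustration of the idea, not the RTToolbox API:

```python
def masked_dose_iterator(dose_grid, structure_mask):
    """Yield the dose of every voxel inside a structure, hiding how the
    dose grid is stored from the algorithms that consume it."""
    for dose, inside in zip(dose_grid, structure_mask):
        if inside:
            yield dose

def mean_dose(dose_iter):
    """Mean structure dose from any dose iterator."""
    doses = list(dose_iter)
    return sum(doses) / len(doses)

def volume_fraction_above(dose_iter, threshold):
    """Fraction of the structure receiving at least `threshold` --
    the building block of a cumulative dose-volume histogram."""
    doses = list(dose_iter)
    return sum(d >= threshold for d in doses) / len(doses)
```

    Because the statistics consume only an iterator, the same algorithms work unchanged whether doses come from a flat array, a compressed grid, or a database query.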

  13. A Novel Simulation Technician Laboratory Design: Results of a Survey-Based Study

    PubMed Central

    Hughes, Patrick G; Friedl, Ed; Ortiz Figueroa, Fabiana; Cepeda Brito, Jose R; Frey, Jennifer; Birmingham, Lauren E; Atkinson, Steven Scott

    2016-01-01

    Objective: The purpose of this study was to elicit feedback from simulation technicians prior to developing the first simulation technician-specific simulation laboratory in Akron, OH. Background: Simulation technicians serve a vital role in simulation centers within hospitals/health centers around the world. The first simulation technician degree program in the US has been approved in Akron, OH. To satisfy the requirements of this program and to meet the needs of this special audience of learners, a customized simulation lab is essential. Method: A web-based survey was circulated to simulation technicians prior to completion of the lab for the new program. The survey consisted of questions aimed at identifying structural and functional design elements of a novel simulation center for the training of simulation technicians. Quantitative methods were utilized to analyze data. Results: Over 90% of technicians (n=65) think that a lab designed explicitly for the training of technicians is novel and beneficial. Approximately 75% of respondents think that the space provided appropriate audiovisual (AV) infrastructure and space to evaluate the ability of technicians to be independent. The respondents think that the lab needed more storage space, visualization space for a large number of students, and more space in the technical/repair area. Conclusions: A space designed for the training of simulation technicians was considered to be beneficial. This laboratory requires distinct space for technical repair, adequate bench space for the maintenance and repair of simulators, an appropriate AV infrastructure, and space to evaluate the ability of technicians to be independent. PMID:27096134

  14. Ten Years of LibQual: A Study of Qualitative and Quantitative Survey Results at the University of Mississippi 2001-2010

    ERIC Educational Resources Information Center

    Greenwood, Judy T.; Watson, Alex P.; Dennis, Melissa

    2011-01-01

    This article analyzes quantitative adequacy gap scores and coded qualitative comments from LibQual surveys at the University of Mississippi from 2001 to 2010, looking for relationships between library policy changes and LibQual results and any other trends that emerged. Analysis found no relationship between changes in policy and survey results…

  15. Spatial Statistical Models and Optimal Survey Design for Rapid Geophysical characterization of UXO Sites

    SciTech Connect

    Ostrouchov, G.; Doll, W. E.; Wolf, D. A.; Beard, L. P.; Morris, M. D.; Butler, D. K.

    2003-07-01

    Unexploded ordnance (UXO) surveys encompass large areas, and the cost of surveying these areas can be high. Enactment of earlier protocols for sampling UXO sites has shown the shortcomings of these procedures and led to a call for development of scientifically defensible statistical procedures for survey design and analysis. This project is one of three funded by SERDP to address this need.

  16. Rotorcraft control system design for uncertain vehicle dynamics using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1994-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which must meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. This theory is applied to the design of the longitudinal flight control system for a linear model of the BO-105C rotorcraft. Uncertainty in the vehicle model is due to the variation in the vehicle dynamics over a range of airspeeds from 0-100 kts. For purposes of exposition, the vehicle description contains no rotor or actuator dynamics. The design example indicates the manner in which significant uncertainty exists in the vehicle model. The advantage of using a sequential loop closure technique to reduce the cost of feedback is demonstrated by example.
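
    The QFT machinery referred to above starts by collecting the plant's frequency response over the uncertain parameter range into "templates". A minimal sketch using a stand-in plant P(s) = k/(s(tau*s + 1)), not the BO-105C model:

```python
import cmath
import math

def plant_template(w, gains, taus):
    """QFT template: the set of (magnitude dB, phase deg) points of
    P(jw) = k / (jw * (tau*jw + 1)) as the uncertain gain k and time
    constant tau range over their values at frequency w (rad/s)."""
    pts = []
    for k in gains:
        for tau in taus:
            p = k / (1j * w * (tau * 1j * w + 1.0))
            pts.append((20.0 * math.log10(abs(p)),
                        math.degrees(cmath.phase(p))))
    return pts

# template at 1 rad/s for gain uncertainty k in {1, 2}, tau = 0.5 s
template = plant_template(1.0, [1.0, 2.0], [0.5])
```

    At each design frequency the spread of the template points dictates how much feedback the controller must supply; doubling the uncertain gain, for instance, widens the template by about 6 dB.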

  17. Rotorcraft flight control design using quantitative feedback theory and dynamic crossfeeds

    NASA Technical Reports Server (NTRS)

    Cheng, Rendy P.

    1995-01-01

    A multi-input, multi-output controls design with robust crossfeeds is presented for a rotorcraft in near-hovering flight using quantitative feedback theory (QFT). Decoupling criteria are developed for dynamic crossfeed design and implementation. Frequency dependent performance metrics focusing on piloted flight are developed and tested on 23 flight configurations. The metrics show that the resulting design is superior to alternative control system designs using conventional fixed-gain crossfeeds and to feedback-only designs which rely on high gains to suppress undesired off-axis responses. The use of dynamic, robust crossfeeds prior to the QFT design reduces the magnitude of required feedback gain and results in performance that meets current handling qualities specifications relative to the decoupling of off-axis responses. The combined effect of the QFT feedback design following the implementation of a low-order, dynamic crossfeed compensator successfully decouples ten of twelve off-axis channels. For the other two channels it was not possible to find a single, low-order crossfeed that was effective.

  18. SDSS-IV MaNGA: Survey Design and Progress

    NASA Astrophysics Data System (ADS)

    Yan, Renbin; MaNGA Team

    2016-01-01

    The ongoing SDSS-IV/MaNGA Survey will obtain integral field spectroscopy at a resolution of R~2000, with wavelength coverage from 3,600A to 10,300A, for 10,000 nearby galaxies. Within each 3 degree diameter pointing of the 2.5m Sloan Telescope, we deploy 17 hexagonal fiber bundles ranging from 12 to 32 arcsec in diameter. The bundles are built with 2 arcsec fibers and have a 56% fill factor. During observations, we obtain sets of exposures at 3 different dither positions to achieve near-critical sampling of the effective point spread function, which has a FWHM of about 2.5 arcsec, corresponding to 1-2 kpc for the majority of the galaxies targeted. Flux calibration is done using 12 additional mini-fiber-bundles that target standard stars simultaneously with the science targets, achieving a calibration accuracy better than 5% over 90% of the wavelength range. The target galaxies are selected to ensure uniform spatial coverage in units of effective radius for the majority of the galaxies while maximizing spatial resolution. About 2/3 of the sample is covered out to 1.5Re (primary sample) and 1/3 out to 2.5Re (secondary sample). The sample is designed to have approximately equal representation from high- and low-mass galaxies while maintaining volume-limited selection at fixed absolute magnitudes. We obtain an average S/N of 4 per Angstrom in the r-band continuum at a surface brightness of 23 AB arcsec^-2. With spectral stacking in an elliptical annulus covering 1-1.5Re, our primary-sample galaxies have a median S/N of ~60 per Angstrom in the r band.

  19. Perspectives of Speech-Language Pathologists on the Use of Telepractice in Schools: Quantitative Survey Results

    PubMed Central

    Tucker, Janice K.

    2012-01-01

    This research surveyed 170 school-based speech-language pathologists (SLPs) in one northeastern state, with only 1.8% reporting telepractice use in school settings. These results were consistent with two ASHA surveys (2002; 2011) that reported limited use of telepractice for school-based speech-language pathology. In the present study, willingness to use telepractice was inversely related to age, perhaps because younger members of the profession are more accustomed to using technology. Overall, respondents were concerned about the validity of assessments administered via telepractice; whether clinicians can adequately establish rapport with clients via telepractice; and whether therapy conducted via telepractice can be as effective as in-person speech-language therapy. Most respondents indicated the need to establish procedures and guidelines for school-based telepractice programs. PMID:25945204

  20. 78 FR 5459 - Medicare Program; Request for Information To Aid in the Design and Development of a Survey...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ... in the Design and Development of a Survey Regarding Patient Experiences With Hospital Outpatient... as a CAHPS survey. The survey will be developed in accordance with CAHPS Survey Design Principles and... to this solicitation, please reply via email to AmbSurgSurvey@cms.hhs.gov or by postal mail...

  1. A rare variant association test in family-based designs and non-normal quantitative traits.

    PubMed

    Lakhal-Chaieb, Lajmi; Oualkacha, Karim; Richards, Brent J; Greenwood, Celia M T

    2016-03-15

    Rare variant studies are now being used to characterize the genetic diversity between individuals and may help to identify substantial amounts of the genetic variation underlying complex diseases and quantitative phenotypes. Family data have been shown to be powerful for interrogating rare variants. Consequently, several rare-variant association tests have recently been developed for family-based designs, but these typically assume normality of the quantitative phenotypes. In this paper, we present a family-based test for rare-variant association in the presence of non-normal quantitative phenotypes. The proposed model relaxes the normality assumption and does not specify any parametric distribution for the marginal distribution of the phenotype. The dependence between relatives is modeled via a Gaussian copula. A score-type test is derived, and several strategies to approximate its distribution under the null hypothesis are investigated. The performance of the proposed test is assessed and compared with existing methods by simulation. The methodology is illustrated with an association study involving the adiponectin trait from the UK10K project. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26420132
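
    The key ingredient of such a copula model, transforming each phenotype to Gaussian quantiles via its ranks while leaving the marginal distribution unspecified, can be sketched as follows. This is a minimal illustration of the standard rank-based normal-scores transform, not the authors' implementation, and the data values are hypothetical:

    ```python
    from statistics import NormalDist

    def normal_scores(values):
        """Map observations to Gaussian quantiles via their ranks.

        This is the rank-based transform commonly used when a Gaussian
        copula models dependence (e.g. between relatives) while the
        marginal distribution of the trait is left unspecified.
        """
        n = len(values)
        # Rank each value (1-based), then map the empirical CDF
        # position (r - 0.5) / n through the inverse normal CDF.
        order = sorted(range(n), key=lambda i: values[i])
        ranks = [0] * n
        for r, i in enumerate(order, start=1):
            ranks[i] = r
        phi_inv = NormalDist().inv_cdf
        return [phi_inv((r - 0.5) / n) for r in ranks]

    # Hypothetical skewed phenotype values for five related individuals
    scores = normal_scores([1.2, 3.4, 2.2, 10.0, 5.5])
    ```

    The transformed scores preserve the ordering of the raw values but follow a symmetric, Gaussian-like scale regardless of how skewed the original trait is.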

  2. Sample size and optimal sample design in tuberculosis surveys

    PubMed Central

    Sánchez-Crespo, J. L.

    1967-01-01

    Tuberculosis surveys sponsored by the World Health Organization have been carried out in different communities during the last few years. Apart from the main epidemiological findings, these surveys have provided basic statistical data for use in the planning of future investigations. In this paper an attempt is made to determine the sample size desirable in future surveys that include one of the following examinations: tuberculin test, direct microscopy, and X-ray examination. The optimum cluster sizes are found to be 100-150 children under 5 years of age in the tuberculin test, at least 200 eligible persons in the examination for excretors of tubercle bacilli (direct microscopy) and at least 500 eligible persons in the examination for persons with radiological evidence of pulmonary tuberculosis (X-ray). Modifications of the optimum sample size in combined surveys are discussed. PMID:5300008
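
    The trade-off behind choosing an optimum cluster size can be illustrated with the standard design-effect calculation for cluster sampling. This is a generic sketch with hypothetical numbers, not the paper's actual computation:

    ```python
    def design_effect(cluster_size, icc):
        """Design effect for one-stage cluster sampling:
        deff = 1 + (m - 1) * ICC, where m is the cluster size and
        ICC is the intra-cluster correlation of the examined trait."""
        return 1.0 + (cluster_size - 1) * icc

    def effective_sample_size(n_total, cluster_size, icc):
        """Number of simple-random-sample persons giving the same
        precision as n_total persons surveyed in clusters."""
        return n_total / design_effect(cluster_size, icc)

    # Hypothetical survey: 5000 persons in clusters of 200, ICC = 0.01
    deff = design_effect(200, 0.01)                    # 2.99
    n_eff = effective_sample_size(5000, 200, 0.01)     # ~1672 persons
    ```

    Larger clusters are cheaper to examine but inflate the design effect, which is why optimum cluster sizes differ between the tuberculin test, microscopy, and X-ray examinations.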

  3. 50 CFR 600.1417 - Requirements for exempted state designation based on submission of recreational survey data.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... the state's participation in a qualifying regional survey, and the survey's sample design, data... designation based on submission of recreational survey data. 600.1417 Section 600.1417 Wildlife and Fisheries... Requirements for exempted state designation based on submission of recreational survey data. (a) To...

  4. 50 CFR 600.1417 - Requirements for exempted state designation based on submission of recreational survey data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the state's participation in a qualifying regional survey, and the survey's sample design, data... designation based on submission of recreational survey data. 600.1417 Section 600.1417 Wildlife and Fisheries... Requirements for exempted state designation based on submission of recreational survey data. (a) To...

  5. Design of a linguistic feature space for quantitative color harmony judgment

    NASA Astrophysics Data System (ADS)

    Shen, Yu-Chuan; Chen, Yung-Sheng; Hsu, Wen Hsing

    1995-04-01

    The successful judgement of color harmony primarily depends on the determined features related to human's pleasure. In this paper, a new color feature of color linguistic distributions (CLD) is proposed upon a designed 1D image scale of 'CHEERFUL-SILENT'. This linguistic feature space is mainly designed by consisting with the color-difference of practical color vision. The CLD is described by a distance-based color linguistic quantization (DCLQ) algorithm, and is capable to indicate the fashion trends in Taiwan. Also, the grade of harmony can be measured based on the similarity of CLDs. Experiment of quantitative color harmony judgement demonstrate that the results based on CIE1976-LUV and CIE1976-LAB color spaces accomplish better consistency with those of questionnaire-based harmony judgement than the hue-dominated method.

  6. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as analysis codes, optimization codes, Computer Aided Design (CAD) tools, and Data Base Management Systems (DBMS) in a heterogeneous environment, and it needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The framework requirements were identified and arranged into a hierarchy, with corresponding solutions, for two reference MDO frameworks: a general one and an aircraft-oriented one. The reference frameworks were then quantitatively characterized using AHP and QFD, and an assessment of three in-house frameworks was performed. The results produced clear and useful guidelines for improving the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
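
    The AHP step can be sketched with the common row geometric-mean approximation of the priority vector. This is an illustrative sketch using a hypothetical 3-criterion pairwise comparison matrix, not the authors' actual matrices or criteria:

    ```python
    import math

    def ahp_weights(pairwise):
        """Approximate AHP priority weights from a pairwise comparison
        matrix using the row geometric-mean method: each row's geometric
        mean is computed and the results are normalized to sum to 1."""
        n = len(pairwise)
        gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
        total = sum(gmeans)
        return [g / total for g in gmeans]

    # Hypothetical comparisons of three framework criteria, e.g.
    # ease of integration vs. GUI quality vs. extensibility,
    # on Saaty's 1-9 scale (a[i][j] = importance of i over j).
    A = [[1,     3,     5],
         [1/3,   1,     3],
         [1/5,   1/3,   1]]
    weights = ahp_weights(A)  # the dominant criterion gets the largest weight
    ```

    The resulting weights can then feed a QFD matrix that scores each candidate framework against the weighted criteria.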

  7. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

    Accidents attributable to in-flight loss of control are the primary cause of fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to identify the quantitative standards for assessing upset recovery performance. The review covers current recovery procedures for both military and commercial aviation and the metrics researchers use to assess aircraft recovery performance. These metrics include time to first input, recognition time, recovery time, and whether the first input was correct or incorrect. Other metrics are the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum airspeed, maximum bank angle, and maximum g loading are reviewed.

  8. Edesign: Primer and Enhanced Internal Probe Design Tool for Quantitative PCR Experiments and Genotyping Assays

    PubMed Central

    Kasahara, Naoko; Delobel, Diane; Hanami, Takeshi; Tanaka, Yuki; de Hoon, Michiel J. L.; Hayashizaki, Yoshihide; Usui, Kengo; Harbers, Matthias

    2016-01-01

    Analytical PCR experiments preferably use internal probes for monitoring the amplification reaction and for specific detection of the amplicon. Such internal probes have to be designed in close context with the amplification primers and may require additional considerations for the detection of genetic variations. Here we describe Edesign, a new online and stand-alone tool for designing sets of PCR primers together with an internal probe for quantitative real-time PCR (qPCR) and genotyping experiments. Edesign can be used for selecting standard DNA oligonucleotides, such as TaqMan probes, but has been further extended with new functions and enhanced design features for Eprobes. Eprobes, with their single thiazole orange-labelled nucleotide, allow for highly sensitive genotyping assays because of their higher DNA binding affinity compared with standard DNA oligonucleotides. Using new thermodynamic parameters, Edesign considers the unique features of Eprobes during primer and probe design for qPCR experiments and genotyping by melting curve analysis. Additional functions in Edesign allow probe design for effective discrimination between wild-type sequences and genetic variations using either standard DNA oligonucleotides or Eprobes. Edesign can be freely accessed online at http://www.dnaform.com/edesign2/, and the source code is available for download. PMID:26863543
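
    Primer and probe design tools of this kind are built around melting-temperature (Tm) estimates. As a simple illustration of the idea, here is the basic Wallace rule for short oligos; this is not Edesign's model, which uses nearest-neighbor thermodynamic parameters:

    ```python
    def wallace_tm(seq):
        """Wallace-rule melting temperature (deg C) for short (<14 nt)
        DNA oligos: Tm = 2*(A+T) + 4*(G+C). A crude first estimate only;
        design tools such as Edesign instead use nearest-neighbor
        thermodynamics, which accounts for base stacking."""
        seq = seq.upper()
        at = seq.count("A") + seq.count("T")
        gc = seq.count("G") + seq.count("C")
        return 2 * at + 4 * gc

    tm = wallace_tm("ATGCATGCAT")  # 2*6 + 4*4 = 28
    ```

    Even this crude rule shows why probe and primer design are coupled: the probe's Tm must sit in a defined relationship to the primers' Tm for the assay to report reliably during each annealing step.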

  9. The Health Effects of Climate Change: A Survey of Recent Quantitative Research

    PubMed Central

    Grasso, Margherita; Manera, Matteo; Chiabai, Aline; Markandya, Anil

    2012-01-01

    In recent years there has been a large scientific and public debate on climate change and its direct as well as indirect effects on human health. In particular, a large amount of research on the effects of climate changes on human health has addressed two fundamental questions. First, can historical data be of some help in revealing how short-run or long-run climate variations affect the occurrence of infectious diseases? Second, is it possible to build more accurate quantitative models which are capable of predicting the future effects of different climate conditions on the transmissibility of particularly dangerous infectious diseases? The primary goal of this paper is to review the most relevant contributions which have directly tackled those questions, both with respect to the effects of climate changes on the diffusion of non-infectious and infectious diseases, with malaria as a case study. Specific attention will be drawn on the methodological aspects of each study, which will be classified according to the type of quantitative model considered, namely time series models, panel data and spatial models, and non-statistical approaches. Since many different disciplines and approaches are involved, a broader view is necessary in order to provide a better understanding of the interactions between climate and health. In this respect, our paper also presents a critical summary of the recent literature related to more general aspects of the impacts of climate changes on human health, such as: the economics of climate change; how to manage the health effects of climate change; the establishment of Early Warning Systems for infectious diseases. PMID:22754455

  10. The health effects of climate change: a survey of recent quantitative research.

    PubMed

    Grasso, Margherita; Manera, Matteo; Chiabai, Aline; Markandya, Anil

    2012-05-01

    In recent years there has been a large scientific and public debate on climate change and its direct as well as indirect effects on human health. In particular, a large amount of research on the effects of climate changes on human health has addressed two fundamental questions. First, can historical data be of some help in revealing how short-run or long-run climate variations affect the occurrence of infectious diseases? Second, is it possible to build more accurate quantitative models which are capable of predicting the future effects of different climate conditions on the transmissibility of particularly dangerous infectious diseases? The primary goal of this paper is to review the most relevant contributions which have directly tackled those questions, both with respect to the effects of climate changes on the diffusion of non-infectious and infectious diseases, with malaria as a case study. Specific attention will be drawn on the methodological aspects of each study, which will be classified according to the type of quantitative model considered, namely time series models, panel data and spatial models, and non-statistical approaches. Since many different disciplines and approaches are involved, a broader view is necessary in order to provide a better understanding of the interactions between climate and health. In this respect, our paper also presents a critical summary of the recent literature related to more general aspects of the impacts of climate changes on human health, such as: the economics of climate change; how to manage the health effects of climate change; the establishment of Early Warning Systems for infectious diseases. PMID:22754455

  11. Acoustical Surveys Of Methane Plumes By Using The Quantitative Echo Sounder In The Eastern Margin Of The Sea of Japan

    NASA Astrophysics Data System (ADS)

    Aoyama, C.; Matsumoto, R.; Okuda, Y.; Ishida, Y.; Hiruta, A.; Sunamura, M.; Numanami, H.; Tomaru, H.; Snyder, G.; Komatsubara, J.; Takeuchi, R.; Hiromatsu, M.; Aoyama, D.; Koike, Y.; Takeda, S.; Hayashi, T.; Hamada, H.

    2004-12-01

    The research and training vessel Umitaka-maru sailed to the methane seep area on a small ridge in the eastern margin of the Sea of Japan in July and August 2004 to survey ocean-floor gas hydrate and the related acoustic signatures of methane plumes using a quantitative echo sounder. Detailed bathymetric profiles have revealed a number of mounds, pockmarks and collapse structures within a 3 km x 4 km area on the ridge at water depths of 910 m to 980 m. We mapped the methane plumes in detail with the quantitative echo sounder and GPS positioning data, and measured the averaged echo intensity from the plumes with the echo integrator, in every 100 m range interval and every minute. The survey yielded the following results. 1) We identified 36 plumes on the echograms, ranging from 100 m to 200 m in diameter and 600 m to 700 m in height, reaching up to 200-300 m below sea level. 2) We measured the averaged volume backscattering strength (SV) of each methane plume; the strongest SV, -45 dB, was stronger than that of a fish school. 3) The averaged SV tends to be highest around the middle of a plume, whereas it is relatively low at the bottom and the top. 4) Some of the plumes showed daily fluctuation in height and width. 5) We recovered several fist-sized chunks of methane hydrate by piston coring in the area where we observed methane plumes. As a follow-up project, we are planning to measure the SV of methane bubbles and methane hydrate floating in water columns through experimental studies in a large water tank.
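
    Averaging backscattering strengths reported in decibels, as done by an echo integrator, must happen in the linear intensity domain rather than on the dB values directly. A minimal sketch of that conversion (the SV samples are hypothetical, not values from this survey):

    ```python
    import math

    def mean_sv_db(sv_samples_db):
        """Average volume backscattering strength (SV) values given in dB.
        Each dB value is converted to a linear intensity, the intensities
        are averaged, and the mean is converted back to dB. A plain
        arithmetic mean of the dB values would be biased low."""
        linear = [10 ** (sv / 10.0) for sv in sv_samples_db]
        return 10.0 * math.log10(sum(linear) / len(linear))

    # Hypothetical SV samples from one 100 m range bin of a plume
    avg = mean_sv_db([-50.0, -45.0, -60.0])  # about -48.5 dB
    ```

    The linear-domain mean is dominated by the strongest sample, which is why a single dense bubble cloud can raise the averaged SV of a whole range bin.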

  12. Applications of numerical optimization methods to helicopter design problems: A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

    Applications of mathematical programming methods to improving the design of helicopters and their components are surveyed, with emphasis on multivariable search techniques in finite-dimensional spaces. Five categories of helicopter design problems are considered: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  13. Applications of numerical optimization methods to helicopter design problems - A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1985-01-01

    Applications of mathematical programming methods to improving the design of helicopters and their components are surveyed, with emphasis on multivariable search techniques in finite-dimensional spaces. Five categories of helicopter design problems are considered: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  14. Applications of numerical optimization methods to helicopter design problems - A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

    Applications of mathematical programming methods to improving the design of helicopters and their components are surveyed, with emphasis on multivariable search techniques in finite-dimensional spaces. Five categories of helicopter design problems are considered: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  15. THE OPTICALLY UNBIASED GAMMA-RAY BURST HOST (TOUGH) SURVEY. I. SURVEY DESIGN AND CATALOGS

    SciTech Connect

    Hjorth, Jens; Malesani, Daniele; Fynbo, Johan P. U.; Kruehler, Thomas; Milvang-Jensen, Bo; Watson, Darach; Jakobsson, Pall; Schulze, Steve; Jaunsen, Andreas O.; Gorosabel, Javier; Levan, Andrew J.; Michalowski, Michal J.; Moller, Palle; Tanvir, Nial R.

    2012-09-10

    Long-duration gamma-ray bursts (GRBs) are powerful tracers of star-forming galaxies. We have defined a homogeneous subsample of 69 Swift GRB-selected galaxies spanning a very wide redshift range. Special attention has been devoted to making the sample optically unbiased through simple and well-defined selection criteria based on the high-energy properties of the bursts and their positions on the sky. Thanks to our extensive follow-up observations, this sample has now achieved a comparatively high degree of redshift completeness, and thus provides a legacy sample, useful for statistical studies of GRBs and their host galaxies. In this paper, we present the survey design and summarize the results of our observing program conducted at the ESO Very Large Telescope (VLT) aimed at obtaining the most basic properties of galaxies in this sample, including a catalog of R and Ks magnitudes and redshifts. We detect the host galaxies for 80% of the GRBs in the sample, although only 42% have Ks-band detections, which confirms that GRB-selected host galaxies are generally blue. The sample is not uniformly blue, however, with two extremely red objects detected. Moreover, galaxies hosting GRBs with no optical/NIR afterglows, whose identification therefore relies on X-ray localizations, are significantly brighter and redder than those with an optical/NIR afterglow. This supports a scenario where GRBs occurring in more massive and dusty galaxies frequently suffer high optical obscuration. Our spectroscopic campaign has resulted in 77% now having redshift measurements, with a median redshift of 2.14 ± 0.18. TOUGH alone includes 17 detected z > 2 Swift GRB host galaxies suitable for individual and statistical studies, a substantial increase over previous samples. Seven hosts have detections of the Lyα emission line; we can exclude an early indication that Lyα emission is ubiquitous among GRB hosts, but confirm that Lyα is stronger in GRB-selected galaxies than in flux-limited samples of Lyman break galaxies.

  16. THE IMACS CLUSTER BUILDING SURVEY. V. FURTHER EVIDENCE FOR STARBURST RECYCLING FROM QUANTITATIVE GALAXY MORPHOLOGIES

    SciTech Connect

    Abramson, Louis E.; Gladders, Michael D.; Dressler, Alan; Oemler, Augustus Jr.; Monson, Andrew; Persson, Eric; Poggianti, Bianca M.; Vulcani, Benedetta

    2013-11-10

    Using J- and Ks-band imaging obtained as part of the IMACS Cluster Building Survey (ICBS), we measure Sérsic indices for 2160 field and cluster galaxies at 0.31 < z < 0.54. Using both mass- and magnitude-limited samples, we compare the distributions for spectroscopically determined passive, continuously star-forming, starburst, and post-starburst systems and show that previously established spatial and statistical connections between these types extend to their gross morphologies. Outside of cluster cores, we find close structural ties between starburst and continuously star-forming, as well as post-starburst and passive types, but not between starbursts and post-starbursts. These results independently support two conclusions presented in Paper II of this series: (1) most starbursts are the product of a non-disruptive triggering mechanism that is insensitive to global environment, such as minor mergers; (2) starbursts and post-starbursts generally represent transient phases in the lives of 'normal' star-forming and quiescent galaxies, respectively, originating from and returning to these systems in closed 'recycling' loops. In this picture, spectroscopically identified post-starbursts constitute a minority of all recently terminated starbursts, largely ruling out the typical starburst as a quenching event in all but the densest environments.

  17. Hydrological drought types in cold climates: quantitative analysis of causing factors and qualitative survey of impacts

    NASA Astrophysics Data System (ADS)

    Van Loon, A. F.; Ploum, S. W.; Parajka, J.; Fleig, A. K.; Garnier, E.; Laaha, G.; Van Lanen, H. A. J.

    2015-04-01

    For drought management and prediction, knowledge of causing factors and socio-economic impacts of hydrological droughts is crucial. Propagation of meteorological conditions in the hydrological cycle results in different hydrological drought types that require separate analysis. In addition to the existing hydrological drought typology, we here define two new drought types related to snow and ice. A snowmelt drought is a deficiency in the snowmelt discharge peak in spring in snow-influenced basins and a glaciermelt drought is a deficiency in the glaciermelt discharge peak in summer in glacierised basins. In 21 catchments in Austria and Norway we studied the meteorological conditions in the seasons preceding and at the time of snowmelt and glaciermelt drought events. Snowmelt droughts in Norway were mainly controlled by below-average winter precipitation, while in Austria both temperature and precipitation played a role. For glaciermelt droughts, the effect of below-average summer air temperature was dominant, both in Austria and Norway. Subsequently, we investigated the impacts of temperature-related drought types (i.e. snowmelt and glaciermelt drought, but also cold and warm snow season drought and rain-to-snow-season drought). In historical archives and drought databases for the US and Europe many impacts were found that can be attributed to these temperature-related hydrological drought types, mainly in the agriculture and electricity production (hydropower) sectors. However, drawing conclusions on the frequency of occurrence of different drought types from reported impacts is difficult, mainly because of reporting biases and the inevitably limited spatial and temporal scales of the information. Finally, this study shows that complete integration of quantitative analysis of causing factors and qualitative analysis of impacts of temperature-related droughts is not yet possible. 
Analysis of selected events, however, points out that it can be a promising research area if more data on drought impacts become available.

  18. Quantitative imaging of the human upper airway: instrument design and clinical studies

    NASA Astrophysics Data System (ADS)

    Leigh, M. S.; Armstrong, J. J.; Paduch, A.; Sampson, D. D.; Walsh, J. H.; Hillman, D. R.; Eastwood, P. R.

    2006-08-01

    Imaging of the human upper airway is widely used in medicine, in both clinical practice and research. Common imaging modalities include video endoscopy, X-ray CT, and MRI. However, no current modality is both quantitative and safe to use for extended periods of time. Such a capability would be particularly valuable for sleep research, which is inherently reliant on long observation sessions. We have developed an instrument capable of quantitative imaging of the human upper airway, based on endoscopic optical coherence tomography. There are no dose limits for optical techniques, and the minimally invasive imaging probe is safe for use in overnight studies. We report on the design of the instrument and its use in preliminary clinical studies, and we present results from a range of initial experiments. The experiments show that the instrument is capable of imaging during sleep, and that it can record dynamic changes in airway size and shape. This information is useful for research into sleep disorders, and potentially for clinical diagnosis and therapies.

  19. Probability of detection of nests and implications for survey design

    USGS Publications Warehouse

    Smith, P.A.; Bart, J.; Lanctot, Richard B.; McCaffery, B.J.; Brown, S.

    2009-01-01

    Surveys based on double sampling include a correction for the probability of detection by assuming complete enumeration of birds in an intensively surveyed subsample of plots. To evaluate this assumption, we calculated the probability of detecting active shorebird nests by using information from observers who searched the same plots independently. Our results demonstrate that this probability varies substantially by species and stage of the nesting cycle but less by site or density of nests. Among the species we studied, the estimated single-visit probability of nest detection during the incubation period varied from 0.21 for the White-rumped Sandpiper (Calidris fuscicollis), the most difficult species to detect, to 0.64 for the Western Sandpiper (Calidris mauri), the most easily detected species, with a mean across species of 0.46. We used these detection probabilities to predict the fraction of persistent nests found over repeated nest searches. For a species with the mean value for detectability, the detection rate exceeded 0.85 after four visits. This level of nest detection was exceeded in only three visits for the Western Sandpiper, but six to nine visits were required for the White-rumped Sandpiper, depending on the type of survey employed. Our results suggest that the double-sampling method's requirement of nearly complete counts of birds in the intensively surveyed plots is likely to be met for birds with nests that survive over several visits of nest searching. Individuals with nests that fail quickly or individuals that do not breed can be detected with high probability only if territorial behavior is used to identify likely nesting pairs. © The Cooper Ornithological Society, 2009.
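
    Under the simplifying assumption of independent visits with a constant per-visit detection probability p, the cumulative detection rate after n visits is 1 - (1 - p)^n. A quick sketch consistent with the figures quoted in the abstract:

    ```python
    def cumulative_detection(p_single, n_visits):
        """Probability that a persistent nest is found at least once in
        n independent visits, each with single-visit probability p:
        1 - (1 - p)^n. Assumes the nest survives all n visits."""
        return 1.0 - (1.0 - p_single) ** n_visits

    # Mean single-visit probability across species from the abstract
    mean_p = 0.46
    rate_4 = cumulative_detection(mean_p, 4)   # ~0.915, exceeding 0.85
    ```

    The same formula shows why species differ so much in required effort: at p = 0.64 the 0.85 threshold falls quickly, while at p = 0.21 many more visits are needed.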

  20. The JCMT Gould Belt Survey: a quantitative comparison between SCUBA-2 data reduction methods

    NASA Astrophysics Data System (ADS)

    Mairs, S.; Johnstone, D.; Kirk, H.; Graves, S.; Buckle, J.; Beaulieu, S. F.; Berry, D. S.; Broekhoven-Fiene, H.; Currie, M. J.; Fich, M.; Hatchell, J.; Jenness, T.; Mottram, J. C.; Nutter, D.; Pattle, K.; Pineda, J. E.; Salji, C.; Francesco, J. Di; Hogerheijde, M. R.; Ward-Thompson, D.; JCMT Gould Belt survey Team

    2015-12-01

    Performing ground-based submillimetre observations is a difficult task, as the measurements are subject to absorption and emission from water vapour in the Earth's atmosphere and to time variation in weather and instrument stability. Removing these features and other artefacts from the data is a vital process which affects the characteristics of the recovered astronomical structure we seek to study. In this paper, we explore two data reduction methods for data taken with the Submillimetre Common-User Bolometer Array-2 (SCUBA-2) at the James Clerk Maxwell Telescope (JCMT). The JCMT Legacy Reduction 1 (JCMT LR1) and the Gould Belt Legacy Survey Legacy Release 1 (GBS LR1) reductions both use the same software (STARLINK) but differ in their choice of data reduction parameters. We find that the JCMT LR1 reduction is suitable for determining whether or not compact emission is present in a given region, while the GBS LR1 reduction is tuned in a robust way to uncover more extended emission, which better serves in-depth physical analyses of star-forming regions. Using the GBS LR1 method, we find that compact sources are recovered well, even at a peak brightness of only three times the noise, whereas the reconstruction of larger objects requires much care when drawing boundaries around the expected astronomical signal in the data reduction process. Incorrect boundaries can lead to false structure identification or cause structure to be missed. In the JCMT LR1 reduction, the extent of the true structure of objects larger than a point source is never fully recovered.

  1. Controls design with crossfeeds for hovering rotorcraft using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Biezad, Daniel J.; Cheng, Rendy

    1996-01-01

    A multi-input, multi-output controls design with dynamic crossfeed pre-compensation is presented for rotorcraft in near-hovering flight using Quantitative Feedback Theory (QFT). The resulting closed-loop control system bandwidth allows the rotorcraft to be considered for use as an in-flight simulator. The use of dynamic, robust crossfeeds prior to the QFT design reduces the magnitude of the required feedback gain and results in performance that meets most handling qualities specifications relative to the decoupling of off-axis responses. Handling qualities are Level 1 for both low-gain and high-gain tasks in the roll, pitch, and yaw axes, except for the 10 deg/s moderate-amplitude yaw command, where phase lag causes the rotorcraft to exhibit Level 2 handling qualities in the yaw axis. The combined effect of the QFT feedback design following the implementation of low-order, dynamic crossfeed compensators successfully decouples ten of twelve off-axis channels. For the other two channels it was not possible to find a single, low-order crossfeed that was effective; this is an area to be investigated in future research.

  2. Survey of design considerations for ventilating and air-conditioning systems in Hong Kong

    SciTech Connect

    Chow, W.K.; Fung, W.Y.

    1996-12-31

    The considerations made in designing ventilating and air-conditioning systems in Hong Kong are surveyed to get a general picture of the current practice. Questionnaires were distributed to building services engineering consultants, contractors, and operation firms to collect information related to the primary design objectives, design guides, and criteria selected; consideration of the airflow pattern in the ventilated space induced by the system; and the associated theories. An evaluation of the current design tools was also made in the survey. The information was then compiled, and it was found that improved methods for designing air distribution systems may be warranted.

  3. A survey of aerobraking orbital transfer vehicle design concepts

    NASA Technical Reports Server (NTRS)

    Park, Chul

    1987-01-01

    The five existing design concepts for the aerobraking orbital transfer vehicle (namely, the raked sphere-cone, conical lifting-brake, raked elliptic-cone, lifting-body, and ballute designs) are reviewed and critiqued. The historical background and the geometrical, aerothermal, and operational features of these designs are reviewed first. Then, the technological requirements for the vehicle (namely, navigation, aerodynamic stability and control, afterbody flow impingement, nonequilibrium radiation, convective heat-transfer rates, mission abort and multiple atmospheric passes, transportation and construction, and the payload-to-vehicle weight requirements) are delineated by summarizing the recent advancements made on these issues. Each of the five designs is critiqued and rated on these issues. The highest and lowest ratings are given to the raked sphere-cone and the ballute designs, respectively.

  4. Survey design for lakes and reservoirs in the United States to assess contaminants in fish tissue

    EPA Science Inventory

    The National Lake Fish Tissue Study (NLFTS) was the first survey of fish contamination in lakes and reservoirs in the 48 conterminous states based on probability survey design. This study included the largest set (268) of persistent, bioaccumulative, and toxic (PBT) chemicals ev...

  5. The Laboratory Course Assessment Survey: A Tool to Measure Three Dimensions of Research-Course Design

    ERIC Educational Resources Information Center

    Corwin, Lisa A.; Runyon, Christopher; Robinson, Aspen; Dolan, Erin L.

    2015-01-01

    Course-based undergraduate research experiences (CUREs) are increasingly being offered as scalable ways to involve undergraduates in research. Yet few if any design features that make CUREs effective have been identified. We developed a 17-item survey instrument, the Laboratory Course Assessment Survey (LCAS), that measures students' perceptions…

  7. National Comorbidity Survey Replication Adolescent Supplement (NCS-A): II. Overview and Design

    ERIC Educational Resources Information Center

    Kessler, Ronald C.; Avenevoli, Shelli; Costello, E. Jane; Green, Jennifer Greif; Gruber, Michael J.; Heeringa, Steven; Merikangas, Kathleen R.; Pennell, Beth-Ellen; Sampson, Nancy A.; Zaslavsky, Alan M.

    2009-01-01

    The national comorbidity survey that seeks to determine the prevalence and correlates of mental disorders among U.S. adolescents is based on a dual-frame design that includes 904 adolescents from a previous household survey and 9,244 adolescent students from a sample of 320 schools. Replacement schools for those that refuse to participate do not…

  8. Targeting Urban Watershed Stressor Gradients: Stream Survey Design, Ecological Responses, and Implications of Land Cover Resolution

    EPA Science Inventory

    We conducted a stream survey in the Narragansett Bay Watershed designed to target a gradient of development intensity, and to examine how associated changes in nutrients, carbon, and stressors affect periphyton and macroinvertebrates. Concentrations of nutrients, cations, and ani...

  9. Effect of survey design and catch rate estimation on total catch estimates in Chinook salmon fisheries

    USGS Publications Warehouse

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2012-01-01

    Roving–roving and roving–access creel surveys are the primary techniques used to obtain information on harvest of Chinook salmon Oncorhynchus tshawytscha in Idaho sport fisheries. Once interviews are conducted using roving–roving or roving–access survey designs, mean catch rate can be estimated with the ratio-of-means (ROM) estimator, the mean-of-ratios (MOR) estimator, or the MOR estimator with exclusion of short-duration (≤0.5 h) trips. Our objective was to examine the relative bias and precision of total catch estimates obtained from use of the two survey designs and three catch rate estimators for Idaho Chinook salmon fisheries. Information on angling populations was obtained by direct visual observation of portions of Chinook salmon fisheries in three Idaho river systems over an 18-d period. Based on data from the angling populations, Monte Carlo simulations were performed to evaluate the properties of the catch rate estimators and survey designs. Among the three estimators, the ROM estimator provided the most accurate and precise estimates of mean catch rate and total catch for both roving–roving and roving–access surveys. On average, the root mean square error of simulated total catch estimates was 1.42 times greater and relative bias was 160.13 times greater for roving–roving surveys than for roving–access surveys. Length-of-stay bias and nonstationary catch rates in roving–roving surveys both appeared to affect catch rate and total catch estimates. Our results suggest that use of the ROM estimator in combination with an estimate of angler effort provided the least biased and most precise estimates of total catch for both survey designs. However, roving–access surveys were more accurate than roving–roving surveys for Chinook salmon fisheries in Idaho.
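
The distinction between the two catch-rate estimators is where the averaging happens: the ratio-of-means estimator divides total catch by total effort across all interviews, while the mean-of-ratios estimator averages the per-trip catch rates. A minimal sketch in Python (the simulated interview data, function names, and the total-effort figure are illustrative, not values from the study):

```python
import random

def rom_estimate(catches, hours):
    # Ratio of means: total catch over total effort across all interviews
    return sum(catches) / sum(hours)

def mor_estimate(catches, hours, min_hours=0.0):
    # Mean of ratios: average of per-trip catch rates, optionally
    # excluding short-duration trips (e.g. <= 0.5 h)
    rates = [c / h for c, h in zip(catches, hours) if h > min_hours]
    return sum(rates) / len(rates)

# Simulated interview data: trip lengths (h) and catches per trip
random.seed(1)
hours = [round(random.uniform(0.25, 6.0), 2) for _ in range(200)]
catches = [sum(random.random() < 0.3 for _ in range(int(h * 4))) for h in hours]

rom = rom_estimate(catches, hours)
mor = mor_estimate(catches, hours)
mor_excl = mor_estimate(catches, hours, min_hours=0.5)

# Total catch = mean catch rate x total angler effort (assumed known here
# from a separate effort count, as in a roving-access design)
total_effort = 5000.0  # angler-hours
print(rom * total_effort, mor * total_effort, mor_excl * total_effort)
```

Short trips tend to produce extreme per-trip rates, which pull the mean-of-ratios estimate around and motivate the short-trip-exclusion variant.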

  10. Preliminary design of the Kunlun Dark Universe Survey Telescope (KDUST)

    NASA Astrophysics Data System (ADS)

    Yuan, Xiangyan; Cui, Xiangqun; Su, Ding-qiang; Zhu, Yongtian; Wang, Lifan; Gu, Bozhong; Gong, Xuefei; Li, Xinnan

    2013-01-01

    From theoretical analysis and four years of site testing at Dome A, Antarctica, we can reasonably predict that it is a very good astronomical site, as good as or even better than Dome C, and suitable for observations ranging from optical to infrared and sub-mm wavelengths. After the Chinese Small Telescope ARray (CSTAR), which was composed of four small fixed telescopes with diameters of 145 mm, and the three Antarctic Survey Telescopes (AST3) with 500 mm entrance diameter, the Kunlun Dark Universe Survey Telescope (KDUST), with a diameter of 2.5 m, is proposed. KDUST will adopt an innovative optical system that can deliver very good image quality over a 2 square degree flat field of view. Other features include a fixed focus suitable for different instruments, active optics for miscollimation correction, a lens-prism that can be used as an atmospheric dispersion corrector or as a very low-dispersion spectrometer when moved in or out of the main optical path without changing the performance of the system, and a compact structure to ease transportation to Dome A. KDUST will be mounted on a 15 m tower to make full use of the superb free-atmosphere seeing.

  11. Systematic review of effects of current transtibial prosthetic socket designs--Part 2: Quantitative outcomes.

    PubMed

    Safari, Mohammad Reza; Meier, Margrit Regula

    2015-01-01

    This review is an attempt to untangle the complexity of transtibial prosthetic socket fit and perhaps find some indication of whether a particular prosthetic socket type might be best for a given situation. In addition, we identified knowledge gaps, thus providing direction for possible future research. We followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines, using medical subject headings and standard key words to search for articles in relevant databases. No restrictions were made on study design or type of outcome measure used. From the obtained search results (n = 1,863), 35 articles were included. The relevant data were entered into a predefined data form that included the Downs and Black risk of bias assessment checklist. This article presents the results from the systematic review of the quantitative outcomes (n = 27 articles). Trends indicate that vacuum-assisted suction sockets improve gait symmetry, volume control, and residual limb health more than other socket designs. Hydrostatic sockets seem to produce more consistent socket fits, reducing a problem that greatly influences outcome measures. Knowledge gaps exist in the understanding of clinically meaningful changes in socket fit and its effect on biomechanical outcomes. Further, safe and comfortable pressure thresholds under various conditions should be determined through a systematic approach. PMID:26436733

  12. First National Survey of Lead and Allergens in Housing: survey design and methods for the allergen and endotoxin components.

    PubMed Central

    Vojta, Patrick J; Friedman, Warren; Marker, David A; Clickner, Robert; Rogers, John W; Viet, Susan M; Muilenberg, Michael L; Thorne, Peter S; Arbes, Samuel J; Zeldin, Darryl C

    2002-01-01

    From July 1998 to August 1999, the U.S. Department of Housing and Urban Development and the National Institute of Environmental Health Sciences conducted the first National Survey of Lead and Allergens in Housing. The purpose of the survey was to assess children's potential household exposure to lead, allergens, and bacterial endotoxins. We surveyed a sample of 831 homes, representing 96 million permanently occupied, noninstitutional housing units that permit resident children. We administered questionnaires to household members, made home observations, and took environmental samples. This article provides general background information on the survey, an overview of the survey design, and a description of the data collection and laboratory methods pertaining to the allergen and endotoxin components. We collected dust samples from a bed, the bedroom floor, a sofa or chair, the living room floor, the kitchen floor, and a basement floor and analyzed them for cockroach allergen Bla g 1, the dust mite allergens Der f 1 and Der p 1, the cat allergen Fel d 1, the dog allergen Can f 1, the rodent allergens Rat n 1 and mouse urinary protein, allergens of the fungus Alternaria alternata, and endotoxin. This article provides the essential context for subsequent reports that will describe the prevalence of allergens and endotoxin in U.S. households, their distribution by various housing characteristics, and their associations with allergic diseases such as asthma and rhinitis. PMID:12003758

  13. First National Survey of Lead and Allergens in Housing: survey design and methods for the allergen and endotoxin components.

    PubMed

    Vojta, Patrick J; Friedman, Warren; Marker, David A; Clickner, Robert; Rogers, John W; Viet, Susan M; Muilenberg, Michael L; Thorne, Peter S; Arbes, Samuel J; Zeldin, Darryl C

    2002-05-01

    From July 1998 to August 1999, the U.S. Department of Housing and Urban Development and the National Institute of Environmental Health Sciences conducted the first National Survey of Lead and Allergens in Housing. The purpose of the survey was to assess children's potential household exposure to lead, allergens, and bacterial endotoxins. We surveyed a sample of 831 homes, representing 96 million permanently occupied, noninstitutional housing units that permit resident children. We administered questionnaires to household members, made home observations, and took environmental samples. This article provides general background information on the survey, an overview of the survey design, and a description of the data collection and laboratory methods pertaining to the allergen and endotoxin components. We collected dust samples from a bed, the bedroom floor, a sofa or chair, the living room floor, the kitchen floor, and a basement floor and analyzed them for cockroach allergen Bla g 1, the dust mite allergens Der f 1 and Der p 1, the cat allergen Fel d 1, the dog allergen Can f 1, the rodent allergens Rat n 1 and mouse urinary protein, allergens of the fungus Alternaria alternata, and endotoxin. This article provides the essential context for subsequent reports that will describe the prevalence of allergens and endotoxin in U.S. households, their distribution by various housing characteristics, and their associations with allergic diseases such as asthma and rhinitis. PMID:12003758

  14. The HETDEX Pilot Survey. I. Survey Design, Performance, and Catalog of Emission-line Galaxies

    NASA Astrophysics Data System (ADS)

    Adams, Joshua J.; Blanc, Guillermo A.; Hill, Gary J.; Gebhardt, Karl; Drory, Niv; Hao, Lei; Bender, Ralf; Byun, Joyce; Ciardullo, Robin; Cornell, Mark E.; Finkelstein, Steven L.; Fry, Alex; Gawiser, Eric; Gronwall, Caryl; Hopp, Ulrich; Jeong, Donghui; Kelz, Andreas; Kelzenberg, Ralf; Komatsu, Eiichiro; MacQueen, Phillip J.; Murphy, Jeremy; Odoms, P. Samuel; Roth, Martin; Schneider, Donald P.; Tufts, Joseph R.; Wilkinson, Christopher P.

    2011-01-01

    We present a catalog of emission-line galaxies selected solely by their emission-line fluxes using a wide-field integral field spectrograph. This work is partially motivated as a pilot survey for the upcoming Hobby-Eberly Telescope Dark Energy Experiment. We describe the observations, reductions, detections, redshift classifications, line fluxes, and counterpart information for 397 emission-line galaxies detected over 169 arcmin² with a 3500-5800 Å bandpass at 5 Å full-width-at-half-maximum (FWHM) spectral resolution. The survey's best sensitivity for unresolved objects under photometric conditions is between 4 and 20 × 10⁻¹⁷ erg s⁻¹ cm⁻² depending on the wavelength, and Lyα luminosities between 3 × 10⁴² and 6 × 10⁴² erg s⁻¹ are detectable. This survey method complements narrowband and color-selection techniques in the search of high-redshift galaxies with its different selection properties and large volume probed. The four survey fields within the COSMOS, GOODS-N, MUNICS, and XMM-LSS areas are rich with existing, complementary data. We find 105 galaxies via their high-redshift Lyα emission at 1.9 < z < 3.8, and the majority of the remainder objects are low-redshift [O II] 3727 emitters at z < 0.56. The classification between low- and high-redshift objects depends on rest-frame equivalent width (EW), as well as other indicators, where available. Based on matches to X-ray catalogs, the active galactic nuclei fraction among the Lyα emitters is 6%. We also analyze the survey's completeness and contamination properties through simulations. We find five high-z, highly significant, resolved objects with FWHM sizes >44 arcsec², which appear to be extended Lyα nebulae. We also find three high-z objects with rest-frame Lyα EW above the level believed to be achievable with normal star formation, EW₀ > 240 Å. Future papers will investigate the physical properties of this sample. This paper includes data taken at The McDonald Observatory of The University of Texas at Austin.
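
The quoted luminosity limits follow from the flux limits via L = 4π d_L² F, where d_L is the luminosity distance. A rough cross-check in Python under assumed cosmological parameters (H0 = 70 km s⁻¹ Mpc⁻¹, Ωm = 0.3, flat ΛCDM; the paper's adopted cosmology may differ):

```python
import math

# Assumed flat Lambda-CDM parameters (illustrative, not from the paper)
H0 = 70.0            # km/s/Mpc
OM, OL = 0.3, 0.7
C_KMS = 299792.458   # speed of light, km/s
MPC_CM = 3.0857e24   # centimetres per megaparsec

def luminosity_distance_mpc(z, steps=2000):
    # comoving distance by trapezoidal integration of (c/H0) dz'/E(z'),
    # then d_L = (1 + z) * d_C for a flat universe
    E = lambda zz: math.sqrt(OM * (1.0 + zz) ** 3 + OL)
    h = z / steps
    s = 0.5 * (1.0 / E(0.0) + 1.0 / E(z))
    s += sum(1.0 / E(i * h) for i in range(1, steps))
    return (1.0 + z) * (C_KMS / H0) * s * h

def line_luminosity(flux_cgs, z):
    # L = 4 pi d_L^2 F, with the flux in erg/s/cm^2
    d_cm = luminosity_distance_mpc(z) * MPC_CM
    return 4.0 * math.pi * d_cm ** 2 * flux_cgs

# a 1e-16 erg/s/cm^2 line at z = 3 corresponds to a few times 10^42 erg/s,
# consistent with the detectable Lya luminosities quoted above
print(f"{line_luminosity(1e-16, 3.0):.2e}")
```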

  15. Temporally Adaptive Sampling: A Case Study in Rare Species Survey Design with Marbled Salamanders (Ambystoma opacum)

    PubMed Central

    Charney, Noah D.; Kubel, Jacob E.; Eiseman, Charles S.

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224
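
The temporally adaptive idea can be sketched with a toy Monte Carlo: regionally correlated "good" and "bad" years, a fixed visit budget, and an adaptive strategy that concentrates visits in good years. All parameter values below are assumed for illustration and are not the study's fitted values.

```python
import random

random.seed(42)

N_SIM = 2000
YEARS = 10
P_GOOD, P_BAD = 0.30, 0.02   # per-visit larval detection probability
VISITS = 10                   # total survey visits available per pond

def detected(p, visits):
    # at least one successful detection in `visits` independent visits
    return any(random.random() < p for _ in range(visits))

fixed_hits = adaptive_hits = 0
for _ in range(N_SIM):
    # regionally correlated year quality: each year is "good" or "bad"
    year_p = [P_GOOD if random.random() < 0.5 else P_BAD for _ in range(YEARS)]
    # non-adaptive design: spread the budget as one visit every year
    fixed_hits += any(detected(p, 1) for p in year_p)
    # adaptive design: sentinel monitoring reveals which years are good,
    # so the whole budget is concentrated in those years
    good = [p for p in year_p if p == P_GOOD] or year_p
    per_year = max(1, VISITS // len(good))
    adaptive_hits += any(detected(p, per_year) for p in good)

print(fixed_hits / N_SIM, adaptive_hits / N_SIM)
```

With these assumed probabilities the adaptive strategy detects the species in noticeably more simulated ponds for the same total effort, mirroring the gain reported above.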

  16. Temporally adaptive sampling: a case study in rare species survey design with marbled salamanders (Ambystoma opacum).

    PubMed

    Charney, Noah D; Kubel, Jacob E; Eiseman, Charles S

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224

  17. Survey of sodium removal methods: LMFBR conceptual design study, Phase 3

    SciTech Connect

    1981-09-01

    At the project design review of the nuclear island maintenance on May 5, 1981, DOE requested a survey of current sodium cleaning methods and facilities. Stone & Webster provided a plan and schedule for this survey, which was approved by Boeing Engineering and Construction Company. The purpose of the survey is to document sodium removal technology and experience as it relates to the CDS Large Developmental Plant, summarize the information, and provide a perspective for the CDS project. The recommendations generated are intended to provide input for a design and layout review of the Nuclear Island Maintenance Building (NIMB).

  18. THE BRIGHTEST OF REIONIZING GALAXIES SURVEY: DESIGN AND PRELIMINARY RESULTS

    SciTech Connect

    Trenti, M.; Bradley, L. D.; Stiavelli, M.; MacKenty, J. W.; Oesch, P.; Carollo, C. M.; Treu, T.; Bouwens, R. J.; Illingworth, G. D.; Shull, J. M.

    2011-02-01

    We present the first results on the search for very bright (M_AB ~ -21) galaxies at redshift z ~ 8 from the Brightest of Reionizing Galaxies (BoRG) survey. BoRG is a Hubble Space Telescope Wide Field Camera 3 (WFC3) pure-parallel survey that is obtaining images on random lines of sight at high Galactic latitudes in four filters (F606W, F098M, F125W, and F160W), with integration times optimized to identify galaxies at z ≳ 7.5 as F098M dropouts. We discuss here results from a search area of approximately 130 arcmin² over 23 BoRG fields, complemented by six other pure-parallel WFC3 fields with similar filters. This new search area is more than two times wider than previous WFC3 observations at z ~ 8. We identify four F098M-dropout candidates with high statistical confidence (detected at greater than 8σ confidence in F125W). These sources are among the brightest candidates currently known at z ~ 8 and approximately 10 times brighter than the z = 8.56 galaxy UDFy-38135539. They thus represent ideal targets for spectroscopic follow-up observations and could potentially lead to a redshift record, as our color selection includes objects up to z ~ 9. However, the expected contamination rate of our sample is about 30% higher than typical searches for dropout galaxies in legacy fields, such as the GOODS and HUDF, where deeper data and additional optical filters are available to reject contaminants.

  19. The Brightest of Reionizing Galaxies Survey: Design and Preliminary Results

    NASA Astrophysics Data System (ADS)

    Trenti, M.; Bradley, L. D.; Stiavelli, M.; Oesch, P.; Treu, T.; Bouwens, R. J.; Shull, J. M.; MacKenty, J. W.; Carollo, C. M.; Illingworth, G. D.

    2011-02-01

    We present the first results on the search for very bright (M_AB ~ -21) galaxies at redshift z ~ 8 from the Brightest of Reionizing Galaxies (BoRG) survey. BoRG is a Hubble Space Telescope Wide Field Camera 3 (WFC3) pure-parallel survey that is obtaining images on random lines of sight at high Galactic latitudes in four filters (F606W, F098M, F125W, and F160W), with integration times optimized to identify galaxies at z ≳ 7.5 as F098M dropouts. We discuss here results from a search area of approximately 130 arcmin² over 23 BoRG fields, complemented by six other pure-parallel WFC3 fields with similar filters. This new search area is more than two times wider than previous WFC3 observations at z ~ 8. We identify four F098M-dropout candidates with high statistical confidence (detected at greater than 8σ confidence in F125W). These sources are among the brightest candidates currently known at z ~ 8 and approximately 10 times brighter than the z = 8.56 galaxy UDFy-38135539. They thus represent ideal targets for spectroscopic follow-up observations and could potentially lead to a redshift record, as our color selection includes objects up to z ~ 9. However, the expected contamination rate of our sample is about 30% higher than typical searches for dropout galaxies in legacy fields, such as the GOODS and HUDF, where deeper data and additional optical filters are available to reject contaminants. Based on observations made with the NASA/ESA Hubble Space Telescope, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555. These observations are associated with Programs 11700, 11702.

  20. Object-oriented programming in control system design: a survey

    SciTech Connect

    Jobling, C.P.; Grant, P.W.; Barker, H.A.; Townsend, P.

    1994-08-01

    The object-oriented paradigm shows great potential as a means for revolutionizing software development. In the last decade, much research has been directed towards the development of design methods, languages, environments, reusable libraries of software components, and database systems to support this paradigm. The first part of the paper presents the terminology of the object-oriented paradigm, reviews the state of the art in object-oriented programming, and discusses class libraries and object-oriented design. The second part of the paper discusses its application in the area of computer-aided control system design. It is argued that the adoption of these ideas will greatly increase the productivity of software developers in this field and improve the facilities offered to users. 215 refs.

  1. Practical aspects of applied optimized survey design for electrical resistivity tomography

    NASA Astrophysics Data System (ADS)

    Wilkinson, Paul B.; Loke, Meng Heng; Meldrum, Philip I.; Chambers, Jonathan E.; Kuras, Oliver; Gunn, David A.; Ogilvy, Richard D.

    2012-04-01

    The use of optimized resistivity tomography surveys to acquire field data imposes extra constraints on the design strategy beyond maximizing the quality of the resulting tomographic image. In this paper, methods are presented to (1) minimize electrode polarization effects, (2) make efficient use of parallel measurement channels, and (3) incorporate data noise estimates in the optimization process. (1) A simulated annealing algorithm is used to rearrange the optimized measurement sequences to minimize polarization errors. The method is developed using random survey designs and is demonstrated to be effective for use with single- and multichannel optimized surveys. (2) An optimization algorithm is developed to design surveys by successive addition of multichannel groups of measurements rather than individual electrode configurations. The multichannel surveys are shown to produce results nearly as close to optimal as equivalent single-channel surveys, while reducing data collection times by an order of magnitude. (3) Random errors in the data are accounted for by weighting the electrode configurations in the optimization process according to a simple error model incorporating background and voltage-dependent noise. The use of data weighting produces optimized surveys that are more robust in the presence of noise, while maintaining as much of the image resolution of the noise-free designs as possible. All the new methods described in this paper are demonstrated using both synthetic and real data, the latter having been measured on an active landslide using a permanently installed geoelectrical monitoring system.
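
The rearrangement step in point (1) can be sketched generically: simulated annealing swaps entries of the measurement sequence to reduce a polarization penalty. The cost model below (a reading is penalized if one of its potential electrodes injected current within the previous few measurements) and all names are illustrative assumptions, not the authors' exact formulation.

```python
import math
import random

random.seed(0)

# Each measurement is (A, B, M, N): current electrodes (A, B) and
# potential electrodes (M, N).
W = 5  # assumed "recovery window" in measurements

def polarization_cost(seq):
    # count potential electrodes that injected current in the last W readings
    cost = 0
    for i, (_, _, m, n) in enumerate(seq):
        recent = {e for a, b, _, _ in seq[max(0, i - W):i] for e in (a, b)}
        cost += (m in recent) + (n in recent)
    return cost

def anneal(seq, steps=4000, t0=4.0):
    # simulated annealing over pairwise swaps of the measurement order
    best = cur = seq[:]
    best_cost = cur_cost = polarization_cost(cur)
    for k in range(steps):
        t = max(t0 * (1 - k / steps), 1e-6)
        i, j = random.sample(range(len(cur)), 2)
        cand = cur[:]
        cand[i], cand[j] = cand[j], cand[i]
        c = polarization_cost(cand)
        if c <= cur_cost or random.random() < math.exp((cur_cost - c) / t):
            cur, cur_cost = cand, c
            if c < best_cost:
                best, best_cost = cand, c
    return best, best_cost

# 120 random four-electrode configurations on a 32-electrode line
cfgs = [tuple(random.sample(range(32), 4)) for _ in range(120)]
before = polarization_cost(cfgs)
order, after = anneal(cfgs)
print(before, after)
```

The set of measurements is unchanged; only their acquisition order moves, which is what makes the approach compatible with an already-optimized survey design.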

  2. Design study of the deepsky ultraviolet survey telescope. [Spacelab payload

    NASA Technical Reports Server (NTRS)

    Page, N. A.; Callaghan, F. G.; Killen, R. H.; Willis, W.

    1977-01-01

    Preliminary mechanical design and specifications are presented for a wide field ultraviolet telescope and detector to be carried as a Spacelab payload. Topics discussed include support structure stiffness (torsional and bending), mirror assembly, thermal control, optical alignment, attachment to the instrument pointing pallet, control and display, power requirements, acceptance and qualification test plans, cost analysis and scheduling. Drawings are included.

  3. Net Survey: "Top Ten Mistakes" in Academic Web Design.

    ERIC Educational Resources Information Center

    Petrik, Paula

    2000-01-01

    Highlights the top ten mistakes in academic Web design: (1) bloated graphics; (2) scaling images; (3) dense text; (4) lack of contrast; (5) font size; (6) looping animations; (7) courseware authoring software; (8) scrolling/long pages; (9) excessive download; and (10) the nothing site. Includes resources. (CMK)

  4. Using GIS to generate spatially balanced random survey designs for natural resource applications.

    PubMed

    Theobald, David M; Stevens, Don L; White, Denis; Urquhart, N Scott; Olsen, Anthony R; Norman, John B

    2007-07-01

    Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sample apply to the entire population. Probability survey designs should be used in natural resource and environmental management situations because they provide the mathematical foundation for statistical inference. The development of long-term monitoring programs demands survey designs that achieve statistical rigor and are efficient but remain flexible to inevitable logistical or practical constraints during field data collection. Here we describe an approach to probability-based survey design, called the Reversed Randomized Quadrant-Recursive Raster, based on the concept of spatially balanced sampling and implemented in a geographic information system. This provides environmental managers a practical tool to generate flexible and efficient survey designs for natural resource applications. Factors commonly used to modify sampling intensity, such as categories, gradients, or accessibility, can be readily incorporated into the spatially balanced sample design. PMID:17546523
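
The spatial-balance idea can be illustrated loosely (this is not the Reversed Randomized Quadrant-Recursive Raster algorithm itself): bin the population on a coarse grid and draw across cells in random order, so selections are spread over space rather than clustered.

```python
import math
import random

random.seed(7)

def spatially_balanced_sample(points, n):
    # Simplified illustration of spatially balanced sampling: one random
    # point per occupied grid cell, cycling until n points are drawn.
    g = max(2, int(math.ceil(math.sqrt(n))))
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x0, x1, y0, y1 = min(xs), max(xs), min(ys), max(ys)
    cells = {}
    for p in points:
        i = min(g - 1, int((p[0] - x0) / (x1 - x0 + 1e-12) * g))
        j = min(g - 1, int((p[1] - y0) / (y1 - y0 + 1e-12) * g))
        cells.setdefault((i, j), []).append(p)
    order = list(cells)
    random.shuffle(order)
    sample = []
    while len(sample) < n and any(cells.values()):
        for key in order:
            if cells[key] and len(sample) < n:
                bucket = cells[key]
                sample.append(bucket.pop(random.randrange(len(bucket))))
    return sample

population = [(random.random(), random.random()) for _ in range(1000)]
sample = spatially_balanced_sample(population, 25)
print(len(sample))
```

Every population unit retains a nonzero selection probability, which is the property that keeps design-based inference valid; the production algorithm achieves this with a hierarchical quadrant address rather than a fixed grid.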

  5. On standard and optimal designs of industrial-scale 2-D seismic surveys

    NASA Astrophysics Data System (ADS)

    Guest, T.; Curtis, A.

    2011-08-01

    The principal aim of performing a survey or experiment is to maximize the desired information within a data set by minimizing the post-survey uncertainty on the ranges of the model parameter values. Using Bayesian, non-linear, statistical experimental design (SED) methods, we show how industrial-scale amplitude variation with offset (AVO) surveys can be constructed to maximize the information content contained in AVO crossplots, the principal source of petrophysical information from seismic surveys. The design method allows offset-dependent errors, previously not allowed in non-linear geoscientific SED methods. The method is applied to a single common-midpoint gather. The results show that the optimal design is highly dependent on the ranges of the model parameter values when a low number of receivers is used, but that a single optimal design exists for the complete range of parameters once the number of receivers is increased above a threshold value. However, when acquisition and processing costs are considered, we find that a design with constant spatial receiver separation becomes close to optimal. This explains why regularly spaced 2-D seismic surveys have performed so well historically, not only from the point of view of noise attenuation and imaging, in which homogeneous data coverage confers distinct advantages, but also in providing data to constrain subsurface petrophysical information.

  6. Quantitative fault tolerant control design for a hydraulic actuator with a leaking piston seal

    NASA Astrophysics Data System (ADS)

    Karpenko, Mark

    Hydraulic actuators are complex fluid power devices whose performance can be degraded in the presence of system faults. In this thesis a linear, fixed-gain, fault tolerant controller is designed that can maintain the positioning performance of an electrohydraulic actuator operating under load with a leaking piston seal and in the presence of parametric uncertainties. Developing a control system tolerant to this class of internal leakage fault is important since a leaking piston seal can be difficult to detect, unless the actuator is disassembled. The designed fault tolerant control law is of low-order, uses only the actuator position as feedback, and can: (i) accommodate nonlinearities in the hydraulic functions, (ii) maintain robustness against typical uncertainties in the hydraulic system parameters, and (iii) keep the positioning performance of the actuator within prescribed tolerances despite an internal leakage fault that can bypass up to 40% of the rated servovalve flow across the actuator piston. Experimental tests verify the functionality of the fault tolerant control under normal and faulty operating conditions. The fault tolerant controller is synthesized based on linear time-invariant equivalent (LTIE) models of the hydraulic actuator using the quantitative feedback theory (QFT) design technique. A numerical approach for identifying LTIE frequency response functions of hydraulic actuators from acceptable input-output responses is developed so that linearizing the hydraulic functions can be avoided. The proposed approach can properly identify the features of the hydraulic actuator frequency response that are important for control system design and requires no prior knowledge about the asymptotic behavior or structure of the LTIE transfer functions. 
A distributed hardware-in-the-loop (HIL) simulation architecture is constructed that enables the performance of the proposed fault tolerant control law to be further substantiated under realistic operating conditions. Using the HIL framework, the fault tolerant hydraulic actuator is operated as a flight control actuator against the real-time numerical simulation of a high-performance jet aircraft. A robust electrohydraulic loading system is also designed using QFT so that the in-flight aerodynamic load can be experimentally replicated. The results of the HIL experiments show that using the fault tolerant controller to compensate the internal leakage fault at the actuator level can benefit the flight performance of the airplane.

  7. Multilevel analysis of women's empowerment and HIV prevention: quantitative survey Results from a preliminary study in Botswana.

    PubMed

    Greig, Fiona E; Koopman, Cheryl

    2003-06-01

    This preliminary study explored relationships between women's empowerment and HIV prevention at the national and individual levels, with a focus on Botswana. Among sub-Saharan African countries, HIV prevalence was positively correlated with indirect indicators of women's empowerment relating to their education (female enrollment in secondary education and the ratio of female to male secondary school enrollment), but not to their economic status (female share of paid employment in industry and services) or political status (women's share of seats in national parliament), while controlling for gross national income, percentage of births attended, and percentage of roads paved. Condom use at last sexual encounter was positively and significantly correlated with both indicators of women's educational empowerment, but was not significantly related to the other two indices. Empowerment at the individual level was explored through a preliminary quantitative survey of 71 sexually active women in Gaborone, Botswana, conducted in February 2001. Regression analyses showed that women's negotiating power and economic independence were the factors most strongly related to condom use, and did not show that education was a crucial factor. Economic independence was the factor most strongly related to negotiating power. These results suggest that in Botswana, HIV prevention efforts may need to improve women's negotiating skills and access to income-generating activities. PMID:14586204

  8. Improved Optical Design for the Large Synoptic Survey Telescope (LSST)

    SciTech Connect

    Seppala, L

    2002-09-24

    This paper presents an improved optical design for the LSST, an f/1.25 three-mirror telescope covering 3.0 degrees full field angle, with 6.9 m effective aperture diameter. The telescope operates at five wavelength bands spanning 386.5 nm to 1040 nm (B, V, R, I and Z). For all bands, 80% of the polychromatic diffracted energy is collected within 0.20 arc-seconds diameter. The reflective telescope uses an 8.4 m f/1.06 concave primary, a 3.4 m convex secondary and a 5.2 m concave tertiary in a Paul geometry. The system length is 9.2 m. A refractive corrector near the detector uses three fused silica lenses, rather than the two lenses of previous designs. Earlier designs required that one element be a vacuum barrier, but now the detector sits in an inert gas at ambient pressure. The last lens is the gas barrier. Small adjustments lead to optimal correction at each band. The filters have different axial thicknesses. The primary and tertiary mirrors are repositioned for each wavelength band. The new optical design incorporates features to simplify manufacturing. They include a flat detector, a far less aspheric convex secondary (10 µm from best-fit sphere) and reduced aspheric departures on the lenses and tertiary mirror. Five aspheric surfaces, on all three mirrors and on two lenses, are used. The primary is nearly parabolic. The telescope is fully baffled so that no specularly reflected light from any field angle, inside or outside of the full field angle of 3.0 degrees, can reach the detector.

  9. Using simulation to evaluate wildlife survey designs: polar bears and seals in the Chukchi Sea.

    PubMed

    Conn, Paul B; Moreland, Erin E; Regehr, Eric V; Richmond, Erin L; Cameron, Michael F; Boveng, Peter L

    2016-01-01

    Logistically demanding and expensive wildlife surveys should ideally yield defensible estimates. Here, we show how simulation can be used to evaluate alternative survey designs for estimating wildlife abundance. Specifically, we evaluate the potential of instrument-based aerial surveys (combining infrared imagery with high-resolution digital photography to detect and identify species) for estimating abundance of polar bears and seals in the Chukchi Sea. We investigate the consequences of different levels of survey effort, flight track allocation and model configuration on bias and precision of abundance estimators. For bearded seals (0.07 animals km⁻²) and ringed seals (1.29 animals km⁻²), we find that eight flights traversing ≈7840 km are sufficient to achieve target precision levels (coefficient of variation (CV) < 20%) for a 2.94×10⁵ km² study area. For polar bears (provisionally, 0.003 animals km⁻²), 12 flights traversing ≈11 760 km resulted in CVs ranging from 28 to 35%. Estimators were relatively unbiased with similar precision over different flight track allocation strategies and estimation models, although some combinations had superior performance. These findings suggest that instrument-based aerial surveys may provide a viable means for monitoring seal and polar bear populations on the surface of the sea ice over large Arctic regions. More broadly, our simulation-based approach to evaluating survey designs can serve as a template for biologists designing their own surveys. PMID:26909183
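The core of such a simulation study can be sketched in miniature. This toy Monte Carlo is my own simplification, not the paper's procedure: it assumes perfect detection, no misidentification, Poisson counts approximated as normal, and an invented 100 m effective strip width. It expands the surveyed-strip count by the coverage fraction and summarizes the estimator's CV:

```python
import random, statistics

def survey_cv(density_per_km2, area_km2, surveyed_km2, n_sims=2000, seed=1):
    """CV of the expansion estimator N_hat = count / coverage for a strip
    survey; strip counts are drawn as normal with Poisson mean and variance."""
    rng = random.Random(seed)
    coverage = surveyed_km2 / area_km2
    mean_count = density_per_km2 * area_km2 * coverage
    estimates = []
    for _ in range(n_sims):
        count = max(0.0, rng.gauss(mean_count, mean_count ** 0.5))
        estimates.append(count / coverage)
    return statistics.stdev(estimates) / statistics.mean(estimates)

# Bearded-seal scenario from the abstract: 0.07 animals per km^2 over a
# 2.94e5 km^2 study area, eight flights totalling ~7840 km of trackline;
# the 100 m effective strip width is an assumption for illustration.
cv = survey_cv(0.07, 2.94e5, 7840 * 0.1)
```

Under these assumptions the bearded-seal scenario comes in under the CV < 20% target; the paper's simulations add the detection, misclassification, and spatial-modelling layers this sketch omits.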

  11. Sample and design considerations in post-disaster mental health needs assessment tracking surveys

    PubMed Central

    Kessler, Ronald C.; Keane, Terence M.; Ursano, Robert J.; Mokdad, Ali; Zaslavsky, Alan M.

    2009-01-01

    Although needs assessment surveys are carried out after many large natural and man-made disasters, synthesis of findings across these surveys and disaster situations about patterns and correlates of need is hampered by inconsistencies in study designs and measures. Recognizing this problem, the US Substance Abuse and Mental Health Services Administration (SAMHSA) assembled a task force in 2004 to develop a model study design and interview schedule for use in post-disaster needs assessment surveys. The US National Institute of Mental Health subsequently approved a plan to establish a center to implement post-disaster mental health needs assessment surveys in the future using an integrated series of measures and designs of the sort proposed by the SAMHSA task force. A wide range of measurement, design, and analysis issues will arise in developing this center. Given that the least widely discussed of these issues concerns study design, the current report focuses on the most important sampling and design issues proposed for this center based on our experiences with the SAMHSA task force, subsequent Katrina surveys, and earlier work in other disaster situations. PMID:19035440

  12. Modified Universal Design Survey: Enhancing Operability of Launch Vehicle Ground Crew Worksites

    NASA Technical Reports Server (NTRS)

    Blume, Jennifer L.

    2010-01-01

    Operability is a driving requirement for next generation space launch vehicles. Launch site ground operations include numerous operator tasks to prepare the vehicle for launch or to perform preflight maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To promote operability, a Design Quality Evaluation Survey based on the Universal Design framework was developed to support Human Factors Engineering (HFE) evaluation for NASA's launch vehicles. Universal Design per se is not a priority for launch vehicle processing; however, applying principles of Universal Design will increase the probability of an error-free and efficient design, which promotes operability. The Design Quality Evaluation Survey incorporates and tailors the seven Universal Design Principles and adds new measures for Safety and Efficiency. Adapting an approach proven to measure Universal Design performance in products, each principle is associated with multiple performance measures, each rated by the degree to which the statement is true. The Design Quality Evaluation Survey was employed for several launch vehicle ground processing worksite analyses. The tool was found to be most useful for comparative judgments as opposed to an assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.

  13. Laboratory design and test procedures for quantitative evaluation of infrared sensors to assess thermal anomalies

    SciTech Connect

    Chang, Y.M.; Grot, R.A.; Wood, J.T.

    1985-06-01

    This report presents a description of the laboratory apparatus and preliminary results of the quantitative evaluation of three high-resolution and two low-resolution infrared imaging systems. These systems, which are commonly used for building diagnostics, are tested under various background temperatures (from -20°C to 25°C) for their minimum resolvable temperature differences (MRTD) at spatial frequencies from 0.03 to 0.25 cycles per milliradian. The calibration curves of absolute and differential temperature measurements are obtained for three systems. The signal transfer function and line spread function at ambient temperature of another three systems are also measured. Comparisons of the measured dependence of the MRTD on background temperature with the predicted values given in ASHRAE Standard 101-83 are also included. The dependence of absolute temperature measurements on background temperature is presented, as well as a comparison of measured data with data given by the manufacturer. Horizontal on-axis magnification factors of the geometric transfer function of two systems are also established to calibrate the horizontal axis of the measured line spread function and obtain the modulation transfer function. The variation in horizontal display uniformity of these two sensors is also observed. Included are detailed descriptions of the laboratory design, equipment setup, and evaluation procedures of each test. 10 refs., 38 figs., 12 tabs.

  14. HomoSAR: bridging comparative protein modeling with quantitative structural activity relationship to design new peptides.

    PubMed

    Borkar, Mahesh R; Pissurlenkar, Raghuvir R S; Coutinho, Evans C

    2013-11-15

    Peptides play significant roles in the biological world. To optimize activity for a specific therapeutic target, peptide library synthesis is often unavoidable, which is time consuming and expensive. Computational approaches provide a promising way to elucidate the structural basis for the design of new peptides. Earlier, we proposed a novel methodology termed HomoSAR to gain insight into the structure-activity relationships underlying peptides. Based on an integrated approach, HomoSAR uses the principles of homology modeling in conjunction with the quantitative structure-activity relationship formalism to predict and design new peptide sequences with optimum activity. In the present study, we establish that the HomoSAR methodology can be universally applied to all classes of peptides irrespective of sequence length by studying HomoSAR on three peptide datasets, viz., angiotensin-converting enzyme inhibitory peptides, CAMEL-s antibiotic peptides, and hAmphiphysin-1 SH3 domain binding peptides, using a set of descriptors related to the hydrophobic, steric, and electronic properties of the 20 natural amino acids. Models generated for all three datasets have statistically significant correlation coefficients (r²), predictive r² (r²pred), and cross-validated coefficients (q²LOO). The elegance of this technique lies in its simplicity and its ability to extract all the information contained in the peptides to elucidate the underlying structure-activity relationships. It can also address the difficulties of correlating both sequence diversity and variation in peptide length with biological activity. The study identified the preferred or detrimental nature of amino acids at specific positions in the peptide sequences. PMID:24105965

  15. Quantitative clinical nonpulsatile and localized visible light oximeter: design of the T-Stat tissue oximeter

    NASA Astrophysics Data System (ADS)

    Benaron, David A.; Parachikov, Ilian H.; Cheong, Wai-Fung; Friedland, Shai; Duckworth, Joshua L.; Otten, David M.; Rubinsky, Boris E.; Horchner, Uwe B.; Kermit, Eben L.; Liu, Frank W.; Levinson, Carl J.; Murphy, Aileen L.; Price, John W.; Talmi, Yair; Weersing, James P.

    2003-07-01

    We report the development of a general, quantitative, and localized visible light clinical tissue oximeter, sensitive to both hypoxemia and ischemia. Monitor design and operation were optimized over four instrument generations. A range of clinical probes were developed, including non-contact wands, invasive catheters, and penetrating needles with injection ports. Real-time data were collected (a) from probes, standards, and reference solutions to optimize each component, (b) from ex vivo hemoglobin solutions co-analyzed for StO2% and pO2 during deoxygenation, and (c) from normoxic human subject skin and mucosal tissue surfaces. Results show that (a) differential spectroscopy allows extraction of features with minimization of the effects of scattering, (b) in vitro oximetry produces a hemoglobin saturation binding curve of expected sigmoid shape and values, and (c) that monitoring human tissues allows real-time tissue spectroscopic features to be monitored. Unlike with near-infrared (NIRS) or pulse oximetry (SpO2%) methods, we found non-pulsatile, diffusion-based tissue oximetry (StO2%) to work most reliably for non-contact reflectance monitoring and for invasive catheter- or needle-based monitoring, using blue to orange light (475-600 nm). Measured values were insensitive to motion artifact. Down time was non-existent. We conclude that the T-Stat oximeter design is suitable for the collection of spectroscopic data from human subjects, and that the oximeter may have application in the monitoring of regional hemoglobin oxygen saturation in the capillary tissue spaces of human subjects.

  16. Hit by a Perfect Storm? Art & Design in the National Student Survey

    ERIC Educational Resources Information Center

    Yorke, Mantz; Orr, Susan; Blair, Bernadette

    2014-01-01

    There has long been the suspicion amongst staff in Art & Design that the ratings given to their subject disciplines in the UK's National Student Survey are adversely affected by a combination of circumstances--a "perfect storm". The "perfect storm" proposition is tested by comparing ratings for Art & Design with…

  17. Usability Evaluation Survey for Identifying Design Issues in Civil Flight Deck

    NASA Astrophysics Data System (ADS)

    Ozve Aminian, Negin; Izzuddin Romli, Fairuz; Wiriadidjaja, Surjatin

    2016-02-01

    Ergonomic assessment of the cockpit in civil aircraft is important because pilots spend most of their time in flight in the seating posture imposed by its design. An improper seat design can cause discomfort and pain, which disturb the pilot's concentration in flight. A survey conducted for this study found several issues with the current cockpit design. This study aims to highlight potential mismatches between the current cockpit design and ergonomic design recommendations for anthropometric dimensions and seat design, which could be the root of the problems faced by the pilots in the cockpit.

  18. Effects of sample survey design on the accuracy of classification tree models in species distribution models

    USGS Publications Warehouse

    Edwards, T.C., Jr.; Cutler, D.R.; Zimmermann, N.E.; Geiser, L.; Moisen, G.G.

    2006-01-01

    We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by resubstitution rates were similar for each lichen species irrespective of the underlying sample survey form. Cross-validation estimates of prediction accuracies were lower than resubstitution accuracies for all species and both design types, and in all cases were closer to the true prediction accuracies based on the EVALUATION data set. We argue that greater emphasis should be placed on calculating and reporting cross-validation accuracy rates rather than simple resubstitution accuracy rates. Evaluation of the DESIGN and PURPOSIVE tree models on the EVALUATION data set shows significantly lower prediction accuracy for the PURPOSIVE tree models relative to the DESIGN models, indicating that non-probabilistic sample surveys may generate models with limited predictive capability. These differences were consistent across all four lichen species, with 11 of the 12 possible species and sample survey type comparisons having significantly lower accuracy rates. Some differences in accuracy were as large as 50%. The classification tree structures also differed considerably both among and within the modelled species, depending on the sample survey form. Overlap in the predictor variables selected by the DESIGN and PURPOSIVE tree models ranged from only 20% to 38%, indicating the classification trees fit the two evaluated survey forms on different sets of predictor variables. The magnitude of these differences in predictor variables throws doubt on ecological interpretation derived from prediction models based on non-probabilistic sample surveys. © 2006 Elsevier B.V. All rights reserved.

  19. Wide Field Infrared Survey Telescope [WFIRST]: telescope design and simulated performance

    NASA Astrophysics Data System (ADS)

    Goullioud, R.; Content, D. A.; Kuan, G. M.; Moore, J. D.; Chang, Z.; Sunada, E. T.; Villalvazo, J.; Hawk, J. P.; Armani, N. V.; Johnson, E. L.; Powell, C. A.

    2012-09-01

    The Wide Field Infrared Survey Telescope (WFIRST) mission concept was ranked first in new space astrophysics missions by the Astro2010 Decadal Survey, incorporating the Joint Dark Energy Mission payload concept and multiple science white papers. This mission is based on a space telescope at L2 studying exoplanets [via gravitational microlensing], probing dark energy, and surveying the near infrared sky. Since the release of the Astro2010 Decadal Survey, the team has been working with the WFIRST Science Definition Team to refine mission and payload concepts. We present the current interim reference mission point design of the payload, based on the use of a 1.3m unobscured aperture three mirror anastigmat form, with focal imaging and slit-less spectroscopy science channels. We also present the first results of Structural/Thermal/Optical performance modeling of the telescope point design.

  20. Study Quality in SLA: An Assessment of Designs, Analyses, and Reporting Practices in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2013-01-01

    This study assesses research and reporting practices in quantitative second language (L2) research. A sample of 606 primary studies, published from 1990 to 2010 in "Language Learning" and "Studies in Second Language Acquisition", was collected and coded for designs, statistical analyses, reporting practices, and outcomes (i.e., effect…

  1. 77 FR 71600 - Medicare Program; Request for Information To Aid in the Design and Development of a Survey...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-03

    ... in the Design and Development of a Survey Regarding Patient Experiences With Emergency Department... with CAHPS Survey Design Principles and implementation instructions will be based on those for CAHPS... to CMS ED_Survey@cms.hhs.gov or by postal mail at Centers for Medicare and Medicaid...

  2. 78 FR 5458 - Medicare Program; Request for Information To Aid in the Design and Development of a Survey...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ... in the Design and Development of a Survey Regarding Patient and Family Member/Friend Experiences With... developed in accordance with CAHPS Survey Design Principles and implementation instructions will be based on... HospiceSurvey@cms.hhs.gov or by postal mail at Centers for Medicare and Medicaid Services,...

  3. A survey of design methods for failure detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1975-01-01

    A number of methods for detecting abrupt changes (such as failures) in stochastic dynamical systems are surveyed. The survey concentrates on the class of linear systems, but the basic concepts, if not the detailed analyses, carry over to other classes of systems. The methods surveyed range from the design of specific failure-sensitive filters, to the use of statistical tests on filter innovations, to the development of jump process formulations. Tradeoffs in complexity versus performance are discussed.

  4. HRMS sky survey wideband feed system design for DSS 24 beam waveguide antenna

    NASA Technical Reports Server (NTRS)

    Stanton, P. H.; Lee, P. R.; Reilly, H. F.

    1993-01-01

    The High-Resolution Microwave Survey (HRMS) Sky Survey project will be implemented on the DSS 24 beam waveguide (BWG) antenna over the frequency range of 2.86 to 10 GHz. Two wideband, ring-loaded, corrugated feed horns were designed to cover this range. The horns match the frequency-dependent gain requirements for the DSS 24 BWG system. The performance of the feed horns and the calculated system performance of DSS 24 are presented.

  6. Statistical Algorithms for Designing Geophysical Surveys to Detect UXO Target Areas

    SciTech Connect

    O'Brien, Robert F.; Carlson, Deborah K.; Gilbert, Richard O.; Wilson, John E.; Bates, Derrick J.; Pulsipher, Brent A.

    2005-07-28

    The U.S. Department of Defense is in the process of assessing and remediating closed, transferred, and transferring military training ranges across the United States. Many of these sites have areas that are known to contain unexploded ordnance (UXO). Other sites or portions of sites are not expected to contain UXO, but some verification of this expectation using geophysical surveys is needed. Many sites are so large that it is often impractical and/or cost prohibitive to perform surveys over 100% of the site. In such cases, it is particularly important to be explicit about the performance required of the surveys. This article presents the statistical algorithms developed to support the design of geophysical surveys along transects (swaths) to find target areas (TAs) of anomalous geophysical readings that may indicate the presence of UXO. The algorithms described here determine (1) the spacing between transects that should be used for the surveys to achieve a specified probability of traversing the TA, (2) the probability of both traversing and detecting a TA of anomalous geophysical readings when the spatial density of anomalies within the TA is either uniform (unchanging over space) or has a bivariate normal distribution, and (3) the probability that a TA exists when it was not found by surveying along transects. These algorithms have been implemented in the Visual Sample Plan (VSP) software to develop cost-effective transect survey designs that meet performance objectives.
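For the simplest geometry the first of these quantities has a closed form. Assuming parallel transects of swath width w at spacing s and a circular target area of radius r whose center falls uniformly at random — a textbook geometric-probability result, not necessarily VSP's exact formula — the traversal probability and the spacing needed to achieve it can be sketched as:

```python
def p_traverse(radius, swath, spacing):
    """P(at least one parallel transect intersects a circular target):
    the target center must lie within (2r + w)/2 of some transect line,
    so the probability is (2r + w)/s, capped at 1."""
    return min(1.0, (2 * radius + swath) / spacing)

def spacing_for(radius, swath, p_required):
    """Largest transect spacing that still achieves p_required."""
    return (2 * radius + swath) / p_required

# e.g. a 50 m radius target area, 2 m swath, 90% traversal probability
spacing = spacing_for(50, 2, 0.9)   # about 113.3 m between transects
```

The detection probability in the second algorithm additionally requires a model of anomaly density within the target area, which this sketch does not attempt.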

  8. Review of quantitative surveys of the length and stability of MTBE, TBA, and benzene plumes in groundwater at UST sites.

    PubMed

    Connor, John A; Kamath, Roopa; Walker, Kenneth L; McHugh, Thomas E

    2015-01-01

    Quantitative information regarding the length and stability condition of groundwater plumes of benzene, methyl tert-butyl ether (MTBE), and tert-butyl alcohol (TBA) has been compiled from thousands of underground storage tank (UST) sites in the United States where gasoline fuel releases have occurred. This paper presents a review and summary of 13 published scientific surveys, of which 10 address benzene and/or MTBE plumes only, and 3 address benzene, MTBE, and TBA plumes. These data show the observed lengths of benzene and MTBE plumes to be relatively consistent among various regions and hydrogeologic settings, with median lengths at a delineation limit of 10 µg/L falling into relatively narrow ranges from 101 to 185 feet for benzene and 110 to 178 feet for MTBE. The observed statistical distributions of MTBE and benzene plumes show the two plume types to be of comparable lengths, with 90th percentile MTBE plume lengths moderately exceeding benzene plume lengths by 16% at a 10-µg/L delineation limit (400 feet vs. 345 feet) and 25% at a 5-µg/L delineation limit (530 feet vs. 425 feet). Stability analyses for benzene and MTBE plumes found 94 and 93% of these plumes, respectively, to be in a nonexpanding condition, and over 91% of individual monitoring wells to exhibit nonincreasing concentration trends. Three published studies addressing TBA found TBA plumes to be of comparable length to MTBE and benzene plumes, with 86% of wells in one study showing nonincreasing concentration trends. PMID:25040137

  9. Spatial coverage and inference: Trade-offs between survey design and model assumptions in the North American Breeding Bird Survey

    USGS Publications Warehouse

    Royle, J. Andrew; Sauer, J.R.

    2005-01-01

    Route selection in the North American Breeding Bird Survey is based on a quasi-stratified random sampling design motivated (in part) by the desire to achieve unbiased estimates of trends and other summaries of avian population status. In practice, some departure from design intentions is realized because active routes become concentrated around urban areas, and this yields unbalanced sampling with respect to habitat and land use patterns, and temporal changes in land use. The need to consider potential biases induced by factors not controlled for (or uncontrollable) by design has motivated the development of a model-based framework for conducting inference about population status and trend assessments from BBS data. The present modeling framework is sufficiently generic to allow consideration of designs that deviate from random sampling. Thus, for example, redundant information that results from clustering routes around urban areas, or targeted sampling to assess specific hypotheses (e.g., about the effect of land-use patterns on population status), can be viewed not as deficiencies in the design, but as features that necessitate extension of existing models used for assessment. In this paper, we consider whether the sampling design is relevant to conducting inference about population status and trends, and we provide a framework for addressing potential biases induced by an imbalance in spatial coverage of sampled routes.

  10. The Visible and Infrared Survey Telescope for Astronomy (VISTA): Design, technical overview, and performance

    NASA Astrophysics Data System (ADS)

    Sutherland, Will; Emerson, Jim; Dalton, Gavin; Atad-Ettedgui, Eli; Beard, Steven; Bennett, Richard; Bezawada, Naidu; Born, Andrew; Caldwell, Martin; Clark, Paul; Craig, Simon; Henry, David; Jeffers, Paul; Little, Bryan; McPherson, Alistair; Murray, John; Stewart, Malcolm; Stobie, Brian; Terrett, David; Ward, Kim; Whalley, Martin; Woodhouse, Guy

    2015-03-01

    The Visible and Infrared Survey Telescope for Astronomy (VISTA) is the 4-m wide-field survey telescope at ESO's Paranal Observatory, equipped with the world's largest near-infrared imaging camera (VISTA IR Camera, VIRCAM), with 1.65 degree diameter field of view, and 67 Mpixels giving 0.6 deg2 active pixel area, operating at wavelengths 0.8-2.3 μm. We provide a short history of the project, and an overview of the technical details of the full system including the optical design, mirrors, telescope structure, IR camera, active optics, enclosure and software. The system includes several innovative design features such as the f/1 primary mirror, the dichroic cold-baffle camera design and the sophisticated wavefront sensing system delivering closed-loop 5-axis alignment of the secondary mirror. We conclude with a summary of the delivered performance, and a short overview of the six ESO public surveys in progress on VISTA.

  11. [Development of a simple quantitative method for the strontium-89 concentration of radioactive liquid waste using the plastic scintillation survey meter for beta rays].

    PubMed

    Narita, Hiroto; Tsuchiya, Yuusuke; Hirase, Kiyoshi; Uchiyama, Mayuki; Fukushi, Masahiro

    2012-11-01

    Strontium-89 (89Sr: pure beta, Eβmax 1.495 MeV (100%), half-life: 50.5 days) chloride is used for pain relief from bone metastases. Assay of 89Sr is difficult because it is a pure beta emitter. For management of 89Sr, we evaluated a simple quantitative method for the 89Sr concentration of radioactive liquid waste using a plastic scintillation survey meter for beta rays. The counting efficiency of the survey meter with this method was 35.95%. A simple 30-minute measurement of 2 ml of the sample made the quantitative measurement of 89Sr practical. Reducing self-absorption of the beta rays in the solution by counting on polyethylene paper improved the counting efficiency. Our method made it easy to manage the radioactive liquid waste under the legal restrictions. PMID:23402205
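The arithmetic behind such a survey-meter assay reduces to dividing the net count by counting time, efficiency, and sample volume. The sketch below is a hypothetical illustration rather than the paper's procedure; the example numbers are invented, and no background or decay correction is shown.

```python
def activity_concentration_bq_per_ml(net_counts, count_time_s,
                                     counting_efficiency, volume_ml):
    """Beta activity concentration (Bq/mL) from a net count,
    assuming a fixed counting efficiency (e.g. 0.3595 as reported)."""
    if count_time_s <= 0 or volume_ml <= 0:
        raise ValueError("count time and volume must be positive")
    counts_per_second = net_counts / count_time_s
    disintegrations_per_second = counts_per_second / counting_efficiency
    return disintegrations_per_second / volume_ml
```

With the reported 35.95% efficiency, a hypothetical 12,942 net counts from a 2-mL sample over a 30-minute (1800 s) count would correspond to 10 Bq/mL.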

  12. Application of a Modified Universal Design Survey for Evaluation of Ares 1 Ground Crew Worksites

    NASA Technical Reports Server (NTRS)

    Blume, Jennifer L.

    2010-01-01

    Operability is a driving requirement for NASA's Ares 1 launch vehicle. Launch site ground operations include several operator tasks to prepare the vehicle for launch or to perform maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To support design evaluation, the Ares 1 Upper Stage (US) element Human Factors Engineering (HFE) group developed a survey based on the Universal Design approach. Universal Design is a process to create products that can be used effectively by as many people as possible. Universal Design per se is not a priority for Ares 1 because launch vehicle processing is a specialized skill and not akin to a consumer product that should be used by all people of all abilities. However, applying principles of Universal Design will increase the probability of an error-free and efficient design, which is a priority for Ares 1. The Design Quality Evaluation Survey centers on the following seven principles: (1) Equitable use, (2) Flexibility in use, (3) Simple and intuitive use, (4) Perceptible information, (5) Tolerance for error, (6) Low physical effort, (7) Size and space for approach and use. Each principle is associated with multiple evaluation criteria, which were rated with the degree to which the statement is true. All statements are phrased in the utmost positive (the design goal), so that the degree to which judgments tend toward "completely agree" directly reflects the degree to which the design is good. The Design Quality Evaluation Survey was employed for several US analyses. The tool was found to be most useful for comparative judgments as opposed to an assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.

  13. Musculoskeletal impairment survey in Rwanda: Design of survey tool, survey methodology, and results of the pilot study (a cross sectional survey)

    PubMed Central

    Atijosan, Oluwarantimi; Kuper, Hannah; Rischewski, Dorothea; Simms, Victoria; Lavy, Christopher

    2007-01-01

    Background Musculoskeletal impairment (MSI) is an important cause of morbidity and mortality worldwide, especially in developing countries. Prevalence studies for MSI in the developing world have used varying methodologies and are seldom directly comparable. This study aimed to develop a new tool to screen for and diagnose MSI and to pilot test the methodology for a national survey in Rwanda. Methods A 7-question screening tool to identify cases of MSI was developed through literature review and discussions with healthcare professionals. To validate the tool, trained rehabilitation technicians screened 93 previously identified gold standard 'cases' and 86 'non cases'. Sensitivity, specificity and positive predictive value were calculated. A standardised examination protocol was developed to determine the aetiology and diagnosis of MSI for those who fail the screening test. For the national survey in Rwanda, multistage cluster random sampling, with probability proportional to size procedures, will be used for selection of a cross-sectional, nationally representative sample of the population. Households to be surveyed will be chosen through compact segment sampling and all individuals within chosen households will be screened. A pilot survey of 680 individuals was conducted using the protocol. Results: The screening tool demonstrated 99% sensitivity and 97% specificity for MSI, and a positive predictive value of 98%. During the pilot study 468 out of 680 eligible subjects (69%) were screened. 45 diagnoses were identified in 38 persons who were cases of MSI. The subjects were grouped into categories based on diagnostic subgroups of congenital (1), traumatic (17), infective (2), neurological (6), and other acquired (19). They were also separated into mild (42.1%), moderate (42.1%) and severe (15.8%) cases, using an operational definition derived from the World Health Organisation's International Classification of Functioning, Disability and Health.
Conclusion: The screening tool had good sensitivity and specificity and was appropriate for use in a national survey. The pilot study showed that the survey protocol was appropriate for measuring the prevalence of MSI in Rwanda. This survey is an important step to building a sound epidemiological understanding of MSI, to enable appropriate health service planning. PMID:17391509
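The validation statistics reported above follow directly from a 2×2 screening table. The sketch below uses hypothetical cell counts chosen only to be consistent with the reported 93 cases and 86 non-cases; the study's actual cell counts are not given in the abstract.

```python
def screening_metrics(true_pos, false_neg, false_pos, true_neg):
    """Sensitivity, specificity, and positive predictive value
    from a 2x2 screening-validation table."""
    sensitivity = true_pos / (true_pos + false_neg)
    specificity = true_neg / (true_neg + false_pos)
    ppv = true_pos / (true_pos + false_pos)
    return sensitivity, specificity, ppv

# Hypothetical cell counts: 92 of 93 cases screened positive,
# 84 of 86 non-cases screened negative.
sens, spec, ppv = screening_metrics(92, 1, 2, 84)
```

These hypothetical counts give roughly 99% sensitivity, 98% specificity, and 98% PPV, close to the figures reported in the abstract.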

  14. Final report on the radiological surveys of designated DX firing sites at Los Alamos National Laboratory

    SciTech Connect

    1996-09-09

    CHEMRAD was contracted by Los Alamos National Laboratory to perform USRADS{reg_sign} (UltraSonic Ranging And Data System) radiation scanning surveys at designated DX Sites at the Los Alamos National Laboratory. The primary purpose of these scanning surveys was to identify the presence of Depleted Uranium (D-38) resulting from activities at the DX Firing Sites. This effort was conducted to update the most recent surveys of these areas. This current effort was initiated with site orientation on August 12, 1996. Surveys were completed in the field on September 4, 1996. This Executive Summary briefly presents the major findings of this work. The detailed survey results are presented in the balance of this report and are organized by Technical Area and Site number in section 2. This organization is not in chronological order. USRADS and the related survey methods are described in section 3. Quality Control issues are addressed in section 4. Surveys were conducted with an array of radiation detectors either mounted on a backpack frame for man-carried use (Manual mode) or on a tricycle cart (RadCart mode). The array included radiation detectors for gamma and beta surface and near-surface contamination as well as dose rate at 1 meter above grade. The radiation detectors were interfaced directly to an USRADS 2100 Data Pack.

  15. The Laboratory Course Assessment Survey: A Tool to Measure Three Dimensions of Research-Course Design

    PubMed Central

    Corwin, Lisa A.; Runyon, Christopher; Robinson, Aspen; Dolan, Erin L.

    2015-01-01

    Course-based undergraduate research experiences (CUREs) are increasingly being offered as scalable ways to involve undergraduates in research. Yet few if any design features that make CUREs effective have been identified. We developed a 17-item survey instrument, the Laboratory Course Assessment Survey (LCAS), that measures students’ perceptions of three design features of biology lab courses: 1) collaboration, 2) discovery and relevance, and 3) iteration. We assessed the psychometric properties of the LCAS using established methods for instrument design and validation. We also assessed the ability of the LCAS to differentiate between CUREs and traditional laboratory courses, and found that the discovery and relevance and iteration scales differentiated between these groups. Our results indicate that the LCAS is suited for characterizing and comparing undergraduate biology lab courses and should be useful for determining the relative importance of the three design features for achieving student outcomes. PMID:26466990

  16. Optical design trade study for the Wide Field Infrared Survey Telescope [WFIRST

    NASA Astrophysics Data System (ADS)

    Content, D. A.; Goullioud, R.; Lehan, J. P.; Mentzell, J. E.

    2011-09-01

    The Wide Field Infrared Survey Telescope (WFIRST) mission concept was ranked first among new space astrophysics missions by the Astro2010 Decadal Survey, incorporating the Joint Dark Energy Mission (JDEM)-Omega payload concept and multiple science white papers. This mission is based on a space telescope at L2 studying exoplanets (via gravitational microlensing), probing dark energy, and surveying the near infrared sky. Since the release of NWNH, the WFIRST project has been working with the WFIRST science definition team (SDT) to refine mission and payload concepts. We present the driving requirements. The current interim reference mission point design, based on the use of a 1.3m unobscured aperture three mirror anastigmat form, with focal imaging and slitless spectroscopy science channels, is consistent with the requirements, requires no technology development, and outperforms the JDEM-Omega design.

  17. Optical Design Trade Study for the Wide Field Infrared Survey Telescope [WFIRST

    NASA Technical Reports Server (NTRS)

    Content, David A.; Goullioud, R.; Lehan, John P.; Mentzell, John E.

    2011-01-01

    The Wide Field Infrared Survey Telescope (WFIRST) mission concept was ranked first among new space astrophysics missions by the Astro2010 Decadal Survey, incorporating the Joint Dark Energy Mission (JDEM)-Omega payload concept and multiple science white papers. This mission is based on a space telescope at L2 studying exoplanets [via gravitational microlensing], probing dark energy, and surveying the near infrared sky. Since the release of NWNH, the WFIRST project has been working with the WFIRST science definition team (SDT) to refine mission and payload concepts. We present the driving requirements. The current interim reference mission point design, based on the use of a 1.3m unobscured aperture three mirror anastigmat form, with focal imaging and slitless spectroscopy science channels, is consistent with the requirements, requires no technology development, and outperforms the JDEM-Omega design.

  18. Wide Field Infrared Survey Telescope [WFIRST]: Telescope Design and Simulated Performance

    NASA Technical Reports Server (NTRS)

    Goullioud, R.; Content, D. A.; Kuan, G. M.; Moore, J. D.; Chang, Z.; Sunada, E. T.; Villalvazo, J.; Hawk, J. P.; Armani, N. V.; Johnson, E. L.; Powell, C. A.

    2012-01-01

    The Astro2010 Decadal Survey proposed multiple missions with NIR focal planes and 3-mirror wide-field telescopes in the 1.5m aperture range, none of which would have been selected as standalone missions. WFIRST is a combination of these missions, created by the Astro2010 committee. The WFIRST Science Definition Team (SDT) was tasked to examine the design. The project team is a GSFC-JPL-Caltech collaboration. This interim mission design is the result of combined work by the project team with the SDT.

  19. Project SAFE [Survey of Administrative Functional Efficiency]. A Feedback Project Designed to Assist Principals.

    ERIC Educational Resources Information Center

    1980

    After a brief explanation of Project SAFE (Survey of Administrative Functional Efficiency) as a system designed to provide necessary feedback to school principals, the author lists the components of the project: (1) a confidential report to the principal summarizing the results of administering the SAFE instrument in the school, (2) a profile of…

  20. USING GIS TO GENERATE SPATIALLY-BALANCED RANDOM SURVEY DESIGNS FOR NATURAL RESOURCE APPLICATIONS

    EPA Science Inventory

    Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sam...

  1. CONDITION ASSESSMENT FOR THE ESCAMBIA RIVER, FL, WATERSHED: BENTHIC MACROINVERTEBRATE SURVEYS USING A PROBABILISTIC SAMPLING DESIGN

    EPA Science Inventory

    Probabilistic sampling has been used to assess the condition of estuarine ecosystems, and the use of this survey design approach was examined for a northwest Florida watershed. Twenty-eight lotic sites within the Escambia River, Florida, watershed were randomly selected and visit...

  2. Writing History: Writing Assignment Design and Evaluation in Two American History Survey Courses.

    ERIC Educational Resources Information Center

    Adkison, Stephen; Woodworth-Ney, Laura; Hatzenbuehler, Ronald L.

    This interactive panel discussion paper by three educators focuses on the design and evaluation process of writing assignments in two American History survey courses at Idaho State University. The paper states that both approaches used--student-centered and authentic writing to communicate--offer two different perspectives on using…

  3. Designing cobalt chromium removable partial dentures for patients with shortened dental arches: a pilot survey.

    PubMed

    Nassani, M Z; Devlin, H; Tarakji, B; McCord, J F

    2011-08-01

    The aim of this survey was to investigate the quality of prescription for the fabrication of cobalt chromium removable partial dentures (RPDs) that are used to extend the shortened dental arches (SDAs). A survey of four commercial dental laboratories located in northern England was conducted. The target of this survey was cobalt chromium RPDs that were requested to restore SDAs comprising the anterior teeth and 2-4 premolars. Dentists' prescriptions were scrutinised, and a special data collection form was completed accordingly. A total of 94 dentists' prescriptions and associated SDA casts were examined. Almost all the requested cobalt chromium RPDs were clasp-retained RPDs (97%). Scrutinising the 91 prescriptions for clasp-retained cobalt chromium RPDs showed that dentists' prescriptions did not have any instructions about the design of the partial denture in a considerable proportion of the cases (32%). Teeth to be clasped were identified clearly in 45% of the prescriptions. A majority of the dentists (64%) failed to provide any instructions about the design of the rests to be placed on the most posterior premolar abutment teeth. A considerable proportion of the dentists delegated the task of selecting the type of the major connector to the dental technician (41%). Only 21 (23%) of the examined casts had clearly defined rest seat preparation. The outcome of this pilot survey shows inadequate quality of prescription in designing RPDs for patients with SDAs. This finding has an ethical and clinical bearing and does not fit with current legal guidelines relevant to designing RPDs. PMID:21175736

  4. ASSESSING THE ECOLOGICAL CONDITION OF A COASTAL PLAIN WATERSHED USING A PROBABILISTIC SURVEY DESIGN

    EPA Science Inventory

    Using a probabilistic survey design, we assessed the ecological condition of the Florida (USA) portion of the Escambia River watershed using selected environmental and benthic macroinvertebrate data. Macroinvertebrates were sampled at 28 sites during July-August 1996, and 3414 i...

  5. Designer cantilevers for even more accurate quantitative measurements of biological systems with multifrequency AFM

    NASA Astrophysics Data System (ADS)

    Contera, S.

    2016-04-01

    Multifrequency excitation/monitoring of cantilevers has made it possible both to achieve fast, relatively simple, nanometre-resolution quantitative mapping of mechanical properties of biological systems in solution using atomic force microscopy (AFM), and single molecule resolution detection by nanomechanical biosensors. A recent paper by Penedo et al [2015 Nanotechnology 26 485706] has made a significant contribution by developing simple methods to improve the signal to noise ratio in liquid environments, by selectively enhancing cantilever modes, which will lead to even more accurate quantitative measurements.

  6. Designer cantilevers for even more accurate quantitative measurements of biological systems with multifrequency AFM.

    PubMed

    Contera, S

    2016-04-01

    Multifrequency excitation/monitoring of cantilevers has made it possible both to achieve fast, relatively simple, nanometre-resolution quantitative mapping of mechanical properties of biological systems in solution using atomic force microscopy (AFM), and single molecule resolution detection by nanomechanical biosensors. A recent paper by Penedo et al [2015 Nanotechnology 26 485706] has made a significant contribution by developing simple methods to improve the signal to noise ratio in liquid environments, by selectively enhancing cantilever modes, which will lead to even more accurate quantitative measurements. PMID:26901640

  7. QuantPrime – a flexible tool for reliable high-throughput primer design for quantitative PCR

    PubMed Central

    Arvidsson, Samuel; Kwasniewski, Miroslaw; Riaño-Pachón, Diego Mauricio; Mueller-Roeber, Bernd

    2008-01-01

    Background Medium- to large-scale expression profiling using quantitative polymerase chain reaction (qPCR) assays is becoming increasingly important in genomics research. A major bottleneck in experiment preparation is the design of specific primer pairs, where researchers have to make several informed choices, often outside their area of expertise. Using currently available primer design tools, several interactive decisions have to be made, resulting in lengthy design processes with varying qualities of the assays. Results Here we present QuantPrime, an intuitive and user-friendly, fully automated tool for primer pair design in small- to large-scale qPCR analyses. QuantPrime can be used online through the internet or on a local computer after download; it offers design and specificity checking with highly customizable parameters and is ready to use with many publicly available transcriptomes of important higher eukaryotic model organisms and plant crops (currently 295 species in total), while benefiting from exon-intron border and alternative splice variant information in available genome annotations. Experimental results with the model plant Arabidopsis thaliana, the crop Hordeum vulgare and the model green alga Chlamydomonas reinhardtii show success rates of designed primer pairs exceeding 96%. Conclusion QuantPrime constitutes a flexible, fully automated web application for reliable primer design for use in larger qPCR experiments, as proven by experimental data. The flexible framework is also open for simple use in other quantification applications, such as hydrolyzation probe design for qPCR and oligonucleotide probe design for quantitative in situ hybridization. Future suggestions made by users can be easily implemented, thus allowing QuantPrime to be developed into a broad-range platform for the design of RNA expression assays. PMID:18976492
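The kinds of informed choices QuantPrime automates can be illustrated with two of the most basic per-primer checks, GC content and a rough Wallace-rule melting temperature. This sketch is emphatically not QuantPrime's algorithm, which additionally performs transcriptome-wide specificity checking and uses exon-intron structure; the filter thresholds below are hypothetical.

```python
def gc_fraction(primer):
    """Fraction of G/C bases in a primer sequence."""
    p = primer.upper()
    return (p.count("G") + p.count("C")) / len(p)


def wallace_tm(primer):
    """Wallace-rule melting temperature estimate (degrees C),
    a rough heuristic valid only for short oligonucleotides:
    Tm = 2*(A+T) + 4*(G+C)."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc


def passes_basic_filters(primer, gc_min=0.4, gc_max=0.6,
                         tm_min=50, tm_max=65):
    """Hypothetical acceptance filter combining both checks."""
    return (gc_min <= gc_fraction(primer) <= gc_max
            and tm_min <= wallace_tm(primer) <= tm_max)
```

A 20-mer with balanced base composition, for instance, has a GC fraction of 0.5 and a Wallace Tm of 60 °C, and would pass these example filters.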

  8. Using Focus Groups To Design a Quantitative Measure: Women's Indirect "No" to Sexual Intimacy.

    ERIC Educational Resources Information Center

    Reeder, Heidi M.

    This study combined qualitative and quantitative methods to assess the reasons many women use indirect messages to say "no" to men's attempts to escalate sexual intimacy. Subjects were six female students at a large southwestern university. At one time, one group had four women; at another time, the group had two. All were Caucasian. The room…

  9. Lessons Learned in Interdisciplinary Professional Development Designed to Promote the Teaching of Quantitative Literacy

    ERIC Educational Resources Information Center

    Lardner, Emily; Bookman, Jack

    2013-01-01

    In this paper, we will describe the challenges and insights gained from conducting professional development workshops aimed at helping faculty prepare materials to support the development of students' quantitative skills in different disciplinary contexts. We will examine some of the mistakes we made, and misconceptions we had, in conducting…

  10. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  11. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys

    PubMed Central

    Hund, Lauren; Bedrick, Edward J.; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis. PMID:26125967
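For the standard (non-clustered) case mentioned above, an LQAS decision rule is set with the binomial distribution: sample n individuals and classify coverage as adequate if more than d are covered. A minimal sketch of the risk calculation, assuming simple random sampling (the paper's cluster designs modify exactly this step):

```python
from math import comb


def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))


def lqas_risks(n, d, p_low, p_high):
    """Misclassification risks for the rule 'accept if more than d
    of n sampled individuals are covered'.
    alpha: probability of rejecting a lot with high coverage p_high.
    beta:  probability of accepting a lot with low coverage p_low."""
    alpha = binom_cdf(d, n, p_high)       # X <= d despite high coverage
    beta = 1.0 - binom_cdf(d, n, p_low)   # X > d despite low coverage
    return alpha, beta
```

The classic n = 19, d = 12 rule for distinguishing 80% from 50% coverage keeps both misclassification risks below 10% under this simple binomial model.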

  12. Alcohol mixed with energy drinks: methodology and design of the Utrecht Student Survey

    PubMed Central

    de Haan, Lydia; de Haan, Hein A; Olivier, Berend; Verster, Joris C

    2012-01-01

    This paper describes the methodology of the Utrecht Student Survey. This online survey was completed in June 2011 by 6002 students living in Utrecht, The Netherlands. The aim of the survey was to determine the potential impact of mixing alcoholic beverages with energy drinks on overall alcohol consumption and alcohol-related consequences. In contrast to most previous surveys on this topic, the current survey used a more appropriate within-subject design: for individuals who consume alcohol mixed with energy drinks, the occasions on which they consume this mixture were compared with the occasions on which they consume alcohol alone. In addition to energy drinks, the consumption of other non-alcoholic mixers was also assessed when combined with alcoholic beverages. Furthermore, the reasons for consuming energy drinks alone or in combination with alcohol were investigated, and were compared to reasons for mixing alcohol with other non-alcoholic beverages. Finally, personality characteristics and the level of risk-taking behavior among the individuals were also assessed to explore their relationship with alcohol consumption. The Utrecht Student Survey will be replicated in the USA, Australia, and the UK. Results will be pooled, but also examined for possible cross-cultural differences. PMID:23118547

  13. Design and synthesis of target-responsive aptamer-cross-linked hydrogel for visual quantitative detection of ochratoxin A.

    PubMed

    Liu, Rudi; Huang, Yishun; Ma, Yanli; Jia, Shasha; Gao, Mingxuan; Li, Jiuxing; Zhang, Huimin; Xu, Dunming; Wu, Min; Chen, Yan; Zhu, Zhi; Yang, Chaoyong

    2015-04-01

    A target-responsive aptamer-cross-linked hydrogel was designed and synthesized for portable and visual quantitative detection of the toxin Ochratoxin A (OTA), which occurs in food and beverages. The hydrogel network forms by hybridization between one designed DNA strand containing the OTA aptamer and two complementary DNA strands grafting on linear polyacrylamide chains. Upon the introduction of OTA, the aptamer binds with OTA, leading to the dissociation of the hydrogel, followed by release of the preloaded gold nanoparticles (AuNPs), which can be observed by the naked eye. To enable sensitive visual and quantitative detection, we encapsulated Au@Pt core-shell nanoparticles (Au@PtNPs) in the hydrogel to generate quantitative readout in a volumetric bar-chart chip (V-Chip). In the V-Chip, Au@PtNPs catalyzes the decomposition of H2O2 to generate O2, which induces movement of an ink bar to a concentration-dependent distance for visual quantitative readout. Furthermore, to improve the detection limit in complex real samples, we introduced an immunoaffinity column (IAC) of OTA to enrich OTA from beer. After the enrichment, as low as 1.27 nM (0.51 ppb) OTA can be detected by the V-Chip, which satisfies the test requirement (2.0 ppb) by the European Commission. The integration of a target-responsive hydrogel with portable enrichment by IAC, as well as signal amplification and quantitative readout by a simple microfluidic device, offers a new method for portable detection of the food safety hazard toxin OTA. PMID:25771715

  14. Addressing statistical and operational challenges in designing large-scale stream condition surveys.

    PubMed

    Dobbie, Melissa J; Negus, Peter

    2013-09-01

    Implementing a statistically valid and practical monitoring design for large-scale stream condition monitoring and assessment programs can be difficult due to factors including the likely existence of a diversity of ecosystem types such as ephemeral streams over the sampling domain; limited resources to undertake detailed monitoring surveys and address knowledge gaps; and operational constraints on effective sampling at monitoring sites. In statistical speak, these issues translate to defining appropriate target populations and sampling units; designing appropriate spatial and temporal sample site selection methods; selection and use of appropriate indicators; and setting effect sizes with limited ecological and statistical information about the indicators of interest. We identify the statistical and operational challenges in designing large-scale stream condition surveys and discuss general approaches for addressing them. The ultimate aim in drawing attention to these challenges is to ensure operational practicality in carrying out future monitoring programs and that the resulting inferences about stream condition are statistically valid and relevant. PMID:23344628

  15. Design and test of postbuckled stiffened curved plates: A literature survey

    NASA Astrophysics Data System (ADS)

    Verolme, J. L.

    1993-02-01

    A designer's tool for compressive buckling of aircraft fuselage panels, currently being developed at the Structures and Materials Laboratory of the Faculty of Aerospace Engineering of Delft University of Technology, must be validated with experimental results. The tested materials will be either isotropic (metal), orthotropic (GLARE, a fiber-metal laminate), or anisotropic (fibre reinforced plastics). For the formulation of a test matrix, a literature survey concentrating on tests of flat and curved, stiffened and unstiffened plates is performed. At the same time, simple semi-empirical formulas are collected to construct a design procedure based on these formulas. The design procedure can be checked and validated with the results of the literature survey.

  16. The Design of a Novel Survey for Small Objects in the Solar System

    SciTech Connect

    Alcock, C.; Chen, W.P.; de Pater, I.; Lee, T.; Lissauer, J.; Rice, J.; Liang, C.; Cook, K.; Marshall, S.; Akerlof, C.

    2000-08-21

    We evaluated several concepts for a new survey for small objects in the Solar System. We designed a highly novel survey for comets in the outer region of the Solar System, which exploits the occultations of relatively bright stars to infer the presence of otherwise extremely faint objects. The populations and distributions of these objects are not known; the uncertainties span orders of magnitude! These objects are important scientifically as probes of the primordial solar system, and programmatically now that major investments may be made in the possible mitigation of the hazard of asteroid or comet collisions with the Earth.

  17. The Design of AN Interactive E-Learning Platform for Surveying Exercise

    NASA Astrophysics Data System (ADS)

    Cheng, S.-C.; Shih, P. T. Y.; Chang, S.-L.; Chen, G.-Y.

    2011-09-01

    Surveying exercise is a fundamental course for Civil Engineering students, distinguished by its field operations. This study explores the design of an e-learning platform for the surveying exercise course. The issues of organizing digital content, such as recorded video of standard instrument operation, editing learning materials, constructing a portfolio of the learning process, and generating learning motivation, are discussed. Annotating uploaded videos, publishing articles and commentaries, interactive examination sessions, peer assessment, and mobile device access were found to be useful elements for this platform.

  18. Estimation of wildlife population ratios incorporating survey design and visibility bias

    USGS Publications Warehouse

    Samuel, M.D.; Steinhorst, R.K.; Garton, E.O.; Unsworth, J.W.

    1992-01-01

    Age and sex ratio statistics are often a key component of the evaluation and management of wildlife populations. These statistics are determined from counts of animals that are commonly plagued by errors associated with either survey design or visibility bias. We present age and sex ratio estimators that incorporate both these sources of error and include the typical situation that animals are sampled in groups. Aerial surveys of elk (Cervus elaphus) in northcentral Idaho illustrate that differential visibility of age or sex classes can produce biased ratio estimates. Visibility models may be used to provide corrected estimates of ratios and their variability that incorporate errors due to sampling, visibility bias, and visibility estimation.
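    The group-based, visibility-corrected ratio estimation described above can be pictured with a minimal Horvitz-Thompson-style sketch. This is a hypothetical illustration, not the authors' estimator; the detection probabilities and group counts below are invented.

```python
# Hypothetical sketch: each observed group contributes count / detection
# probability to the corrected class total, and the ratio is formed from
# the corrected totals rather than the raw counts.

def corrected_total(groups):
    """groups: list of (count, detection_probability) tuples."""
    return sum(n / p for n, p in groups)

def visibility_corrected_ratio(young, adults):
    """Ratio of corrected young total to corrected adult total."""
    return corrected_total(young) / corrected_total(adults)

# Example: calves are harder to see (p = 0.5) than cows (p = 0.8).
calves = [(2, 0.5), (1, 0.5)]   # observed groups of calves
cows = [(5, 0.8), (3, 0.8)]     # observed groups of cows

naive = (2 + 1) / (5 + 3)                             # ignores visibility
corrected = visibility_corrected_ratio(calves, cows)  # visibility-corrected
print(naive, corrected)
```

    With differential visibility, the naive calf:cow ratio (0.375) understates the corrected ratio (0.6), which is the bias the abstract describes.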

  19. Why we love or hate our cars: A qualitative approach to the development of a quantitative user experience survey.

    PubMed

    Tonetto, Leandro Miletto; Desmet, Pieter M A

    2016-09-01

    This paper presents a more ecologically valid way of developing theory-based item questionnaires for measuring user experience. In this novel approach, items were generated using the natural, domain-specific language of the research population, which appears to make the survey much more sensitive to real experiences than theory-based items. The approach was applied in a survey that measured car experience. Ten in-depth interviews were conducted with drivers inside their cars. The resulting transcripts were analysed with the aim of capturing the natural utterances drivers use to express their car experience. This analysis resulted in 71 categories of answers. For each category, one sentence was selected to serve as a survey item. On an online platform, 538 respondents answered the survey. Data reliability, tested with Cronbach's alpha, was 0.94, suggesting a survey with highly reliable results for measuring drivers' appraisals of their cars. PMID:27184312
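    The reliability statistic reported above can be reproduced in form with a short sketch of Cronbach's alpha. The toy data below are invented, not the study's; the sketch uses population variances, one common convention.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """items: list of per-item score lists (same respondents, same order).
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(items)
    item_vars = sum(pvariance(scores) for scores in items)
    totals = [sum(resp) for resp in zip(*items)]  # per-respondent total score
    return k / (k - 1) * (1 - item_vars / pvariance(totals))

# Toy data: 3 items, 4 respondents (1-5 Likert scores).
items = [
    [4, 5, 3, 2],
    [4, 4, 3, 1],
    [5, 5, 2, 2],
]
print(round(cronbach_alpha(items), 3))
```

    Values near 0.9 or above, as in the survey described, indicate highly consistent responses across items.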

  20. Distance software: design and analysis of distance sampling surveys for estimating population size

    PubMed Central

    Thomas, Len; Buckland, Stephen T; Rexstad, Eric A; Laake, Jeff L; Strindberg, Samantha; Hedley, Sharon L; Bishop, Jon RB; Marques, Tiago A; Burnham, Kenneth P

    2010-01-01

    1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance. 2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use. 3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated. 4. A first step in analysis of distance sampling data is modelling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance. 5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap. 6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modelling analysis engine for spatial and habitat modelling, and information about accessing the analysis engines directly from other software. 7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. In step with these theoretical developments, we describe state-of-the-art software that makes the methods accessible to practising ecologists. PMID:20383262
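    In its simplest form, the conventional-distance-sampling engine mentioned above reduces to the estimator sketched below. This is illustrative only: the detection-function scale sigma is treated as known, whereas in practice Distance fits it to the observed perpendicular distances, and all numbers are invented.

```python
import math

# Half-normal detection function g(x) = exp(-x^2 / (2 sigma^2)), which
# satisfies the assumption that all objects at zero distance are detected.

def effective_half_width(sigma, w, steps=10000):
    """Midpoint-rule integral of g(x) from 0 to the truncation distance w."""
    dx = w / steps
    return sum(math.exp(-((i + 0.5) * dx) ** 2 / (2 * sigma**2)) * dx
               for i in range(steps))

def density_estimate(n, sigma, w, line_length):
    """Density = n / (2 * effective strip half-width * total transect length)."""
    mu = effective_half_width(sigma, w)
    return n / (2 * mu * line_length)

# 40 detections, sigma = 25 m, truncation at 100 m, 5 km of transect line.
d = density_estimate(n=40, sigma=25.0, w=100.0, line_length=5000.0)
print(d)  # animals per square metre
```

    For a half-normal detection function with a large truncation distance, the effective half-width approaches sigma * sqrt(pi/2), so the numerical integral can be checked against that closed form.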

  1. The inclusion of open-ended questions on quantitative surveys of children: Dealing with unanticipated responses relating to child abuse and neglect.

    PubMed

    Lloyd, Katrina; Devine, Paula

    2015-10-01

    Web surveys have been shown to be a viable, and relatively inexpensive, method of data collection with children. For this reason, the Kids' Life and Times (KLT) was developed as an annual online survey of 10- and 11-year-old children. Each year, approximately 4,000 children participate in the survey. Throughout the six years that KLT has been running, a range of questions has been asked that are both policy-relevant and important to the lives of children. Given the method employed by the survey, no extremely sensitive questions that might cause the children distress are included. The majority of questions on KLT are closed, yielding quantitative data that are analysed statistically; however, one regular open-ended question is included at the end of KLT each year so that the children can suggest questions that they think should be asked on the survey the following year. While most of the responses are innocuous, each year a small minority of children suggest questions on child abuse and neglect. This paper reports the responses to this question and reflects on how researchers can, and should, deal with this issue from both a methodological and an ethical perspective. PMID:25952476

  2. Composite Interval Mapping Based on Lattice Design for Error Control May Increase Power of Quantitative Trait Locus Detection

    PubMed Central

    Huang, Zhongwen; Zhao, Tuanjie; Xing, Guangnan; Gai, Junyi; Guan, Rongzhan

    2015-01-01

    Experimental error control is very important in quantitative trait locus (QTL) mapping. Although numerous statistical methods have been developed for QTL mapping, a QTL detection model based on an appropriate experimental design that emphasizes error control has not been developed. Lattice design is very suitable for experiments with large sample sizes, which is usually required for accurate mapping of quantitative traits. However, the lack of a QTL mapping method based on lattice design has meant that the arithmetic mean or adjusted mean of each line of observations in the lattice design had to be used as the response variable, resulting in low QTL detection power. As an improvement, we developed a QTL mapping method termed composite interval mapping based on lattice design (CIMLD). In the lattice design, experimental errors are decomposed into random errors and block-within-replication errors. Four levels of block-within-replication errors were simulated to show the power of QTL detection under different error controls. The simulation results showed that the arithmetic mean method, which is equivalent to a method under randomized complete block design (RCBD), was very sensitive to the size of the block variance: as block variance increased, the power of QTL detection decreased from 51.3% to 9.4%. In contrast to the RCBD method, the power of CIMLD and the adjusted mean method did not change for different block variances. The CIMLD method showed 1.2- to 7.6-fold higher power of QTL detection than the arithmetic or adjusted mean methods. Our proposed method was applied to real soybean (Glycine max) data as an example and 10 QTLs for biomass were identified that explained 65.87% of the phenotypic variation, while only three and two QTLs were identified by arithmetic and adjusted mean methods, respectively. PMID:26076140

  3. Evaluation of the Total Design Method in a survey of Japanese dentists

    PubMed Central

    Nakai, Yukie; Milgrom, Peter; Yoshida, Toshiko; Ishihara, Chikako; Shimono, Tsutomu

    2005-01-01

    Background This study assessed the application of the Total Design Method (TDM) in a mail survey of Japanese dentists. The TDM was chosen because survey response rates in Japan are unacceptably low and the TDM had previously been used in a general population survey. Methods Four hundred and seventy eight dentist members of the Okayama Medical and Dental Practitioner's Association were surveyed. The nine-page, 27-item questionnaire covered dentist job satisfaction, physical practice, and dentist and patient characteristics. Respondents to the first mailing or the one-week follow-up postcard were defined as early responders; others who responded were late responders. Responder bias was assessed by examining age, gender and training. Results The overall response rate was 46.7% (223/478). The response rates by follow-up mailing were, 18% after the first mailing, 35.4% after the follow-up postcard, 42.3% after the second mailing, and 46.7% after the third mailing. Respondents did not differ from non-respondents in age or gender, nor were there differences between early and late responders. Conclusion The application of TDM in this survey of Japanese dentists produced lower rates of response than expected from previous Japanese and US studies. PMID:16115323

  4. Hepatitis C Virus RNA Real-Time Quantitative RT-PCR Method Based on a New Primer Design Strategy.

    PubMed

    Chen, Lida; Li, Wenli; Zhang, Kuo; Zhang, Rui; Lu, Tian; Hao, Mingju; Jia, Tingting; Sun, Yu; Lin, Guigao; Wang, Lunan; Li, Jinming

    2016-01-01

    Viral nucleic acids are unstable when improperly collected, handled, and stored, resulting in decreased sensitivity of currently available commercial quantitative nucleic acid testing kits. Using known unstable hepatitis C virus RNA, we developed a quantitative RT-PCR method based on a new primer design strategy to reduce the impact of nucleic acid instability on nucleic acid testing. The performance of the method was evaluated for linearity, limit of detection, precision, specificity, and agreement with commercial hepatitis C virus assays. Its clinical application was compared to that of two commercial kits--Cobas AmpliPrep/Cobas TaqMan (CAP/CTM) and Kehua. The quantitative RT-PCR method delivered a good performance, with a linearity of R(2) = 0.99, a total limit of detection (genotypes 1 to 6) of 42.6 IU/mL (95% CI, 32.84 to 67.76 IU/mL), a CV of 1.06% to 3.34%, a specificity of 100%, and a high concordance with the CAP/CTM assay (R(2) = 0.97), with a means ± SD value of -0.06 ± 1.96 log IU/mL (range, -0.38 to 0.25 log IU/mL). The method was superior to commercial assays in detecting unstable hepatitis C virus RNA (P < 0.05). This quantitative RT-PCR method can effectively eliminate the influence of RNA instability on nucleic acid testing. The principle of primer design strategy may be applied to the detection of other RNA or DNA viruses. PMID:26612712

  5. A survey of scientific literacy to provide a foundation for designing science communication in Japan.

    PubMed

    Kawamoto, Shishin; Nakayama, Minoru; Saijo, Miki

    2013-08-01

    There are various definitions and survey methods for scientific literacy. Taking into consideration the contemporary significance of scientific literacy, we have defined it with an emphasis on its social aspects. To acquire the insights needed to design a form of science communication that will enhance the scientific literacy of each individual, we conducted a large-scale random survey within Japan of individuals older than 18 years, using a printed questionnaire. The data thus acquired were analyzed using factor analysis and cluster analysis to create a 3-factor/4-cluster model of people's interest and attitude toward science, technology and society and their resulting tendencies. Differences were found among the four clusters in terms of the three factors: scientific factor, social factor, and science-appreciating factor. We propose a plan for designing a form of science communication that is appropriate to this current status of scientific literacy in Japan. PMID:23885051

  6. A Student-Designed Potentiometric Titration: Quantitative Determination of Iron(II) by Caro's Acid Titration

    NASA Astrophysics Data System (ADS)

    Powell, Joyce R.; Tucker, Sheryl A.; Acree, William E., Jr.; Sees, Jennifer A.; Hall, Lindsey H.

    1996-10-01

    A laboratory experiment involving the feasibility of using Caro's acid, H2SO5, as a titrant in the potentiometric determination of iron(II) is presented. Specific items considered in the experiment include (i) method of endpoint detection; (ii) shelf-life stability of titrant; (iii) accuracy, relative precision and experimental uncertainty; and (iv) which common cations and/or anions (if any) interfere with the titration. Typical student results gave iron(II) concentrations within 1-2% of the "true" values. Of the 14 cation/anion pairs suggested in the paper, only tin(II) was found to interfere with the quantitative determination of iron(II) using Caro's acid titrant.

  7. Creel survey sampling designs for estimating effort in short-duration Chinook salmon fisheries

    USGS Publications Warehouse

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2013-01-01

    Chinook Salmon Oncorhynchus tshawytscha sport fisheries in the Columbia River basin are commonly monitored using roving creel survey designs and require precise, unbiased catch estimates. The objective of this study was to examine the relative bias and precision of total catch estimates using various sampling designs to estimate angling effort under the assumption that mean catch rate was known. We obtained information on angling populations based on direct visual observations of portions of Chinook Salmon fisheries in three Idaho river systems over a 23-d period. Based on the angling population, Monte Carlo simulations were used to evaluate the properties of effort and catch estimates for each sampling design. All sampling designs evaluated were relatively unbiased. Systematic random sampling (SYS) resulted in the most precise estimates. The SYS and simple random sampling designs had mean square error (MSE) estimates that were generally half of those observed with cluster sampling designs. The SYS design was more efficient (i.e., higher accuracy per unit cost) than a two-cluster design. Increasing the number of clusters available for sampling within a day decreased the MSE of estimates of daily angling effort, but the MSE of total catch estimates was variable depending on the fishery. The results of our simulations provide guidelines on the relative influence of sample sizes and sampling designs on parameters of interest in short-duration Chinook Salmon fisheries.
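    The design comparison described above can be mimicked with a toy Monte Carlo experiment. This is not the authors' simulation: the hourly angler counts, fishing-day length, and sample sizes below are invented, and the example simply shows why systematic sampling gains precision when effort follows a smooth diurnal pattern.

```python
import random

# Estimate total daily angler-hours from instantaneous counts in 4 of 12
# fishing hours, comparing simple random vs. systematic sampling by MSE.

random.seed(1)
counts = [3, 5, 9, 14, 18, 20, 19, 15, 11, 8, 5, 2]  # hypothetical hourly counts
true_total = sum(counts)

def estimate(sample_hours):
    # Expand the sample mean count to all 12 fishing hours.
    return 12 * sum(counts[h] for h in sample_hours) / len(sample_hours)

def srs():
    return random.sample(range(12), 4)

def systematic():
    start = random.randrange(3)          # random start, then every 3rd hour
    return [start + 3 * i for i in range(4)]

def mse(design, reps=20000):
    return sum((estimate(design()) - true_total) ** 2 for _ in range(reps)) / reps

print(mse(srs), mse(systematic))
```

    Because the systematic sample always spans the morning-to-evening trend, its estimates vary far less than those from simple random samples of the same size, mirroring the roughly halved MSE the study reports.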

  8. Robotic influence in the conceptual design of mechanical systems in space and vice versa - A survey

    NASA Technical Reports Server (NTRS)

    Sanger, George F.

    1988-01-01

    A survey of methods using robotic devices to construct structural elements in space is presented. Two approaches to robotic construction are considered: one in which the structural elements are designed using conventional aerospace techniques which tend to constrain the function aspects of robotics and one in which the structural elements are designed from the conceptual stage with built-in robotic features. Examples are presented of structural building concepts using robotics, including the construction of the SP-100 nuclear reactor power system, a multimirror large aperture IR space telescope concept, retrieval and repair in space, and the Flight Telerobotic Servicer.

  9. Radiologists' requirements for primary diagnosis workstations: preliminary results of task-based design surveys

    NASA Astrophysics Data System (ADS)

    Hohman, Suzan A.; Johnson, Sandra L.; Valentino, Daniel J.; Taira, Ricky K.; Manzo, William A.

    1994-05-01

    There has been a tremendous amount of effort put into the design of diagnostic radiology workstations; however, few workstations have been clinically accepted. Among the requirements for a clinically acceptable workstation are good image quality, a well-designed user interface, and access to all relevant diagnostic information. The user-interface design should reflect radiologists' film-reading habits and encourage new reading methods that take advantage of the electronic environment. As part of our effort to improve diagnostic workstation design, we surveyed radiologists in the UCLA Department of Radiological Sciences. Sixteen radiologists from the fields of pediatric, genitourinary, thoracic, and neuroradiology participated in the initial survey. We asked their opinions regarding our PACS infrastructure performance and our existing diagnostic workstations. We also asked them to identify certain pathologies that they found to be less evident on workstations as compared to film. We are using this information to determine the current limitations of diagnostic workstations and to develop a user-interface design that addresses the clinical requirements of a busy tertiary care medical center and of the radiologists who use it.

  10. Design and Implementation Issues in Surveying the Views of Young Children in Ethnolinguistically Diverse Developing Country Contexts

    ERIC Educational Resources Information Center

    Smith, Hilary A.; Haslett, Stephen J.

    2016-01-01

    This paper discusses issues in the development of a methodology appropriate for eliciting sound quantitative data from primary school children in the complex contexts of ethnolinguistically diverse developing countries. Although these issues often occur in field-based surveys, the large extent and compound effects of their occurrence in…

  11. Design considerations and quantitative assessment for the development of percutaneous mitral valve stent.

    PubMed

    Kumar, Gideon Praveen; Cui, Fangsen; Phang, Hui Qun; Su, Boyang; Leo, Hwa Liang; Hon, Jimmy Kim Fatt

    2014-07-01

    Percutaneous heart valve replacement is gaining popularity, as more positive reports of satisfactory early clinical experiences are published. However, this technique is mostly used for the replacement of pulmonary and aortic valves and less often for the repair and replacement of atrioventricular valves, mainly due to their anatomical complexity. While the challenges posed by the complexity of the mitral annulus anatomy cannot be mitigated, it is possible to design mitral stents that offer good anchorage and support to the valve prosthesis. This paper describes four new Nitinol-based mitral valve designs with specific features intended to address the migration and paravalvular leaks associated with mitral valve designs. The paper also describes maximum possible crimpability assessment of these mitral stent designs using a crimpability index formulation based on the various stent design parameters. The actual crimpability of the designs was further evaluated using finite element analysis (FEA). Furthermore, fatigue modeling and analysis were also performed on these designs. One of the models was then coated with polytetrafluoroethylene (PTFE), with leaflets sutured, and put to (i) leaflet functional tests to check for proper coaptation of the leaflets and regurgitation leakage on a phantom model and (ii) an anchorage test in which the stented valve was deployed in an explanted pig heart. Simulation results showed that all the stent designs could be crimped to 18F without mechanical failure. Leaflet functional test results showed that the valve leaflets in the fabricated stented valve coapted properly and that the regurgitation leakage was within acceptable limits. Deployment of the stented valve in the explanted heart showed that it anchors well in the mitral annulus.
    Based on these results for the one design tested, the other stent models proposed here were also considered promising for percutaneous replacement of mitral valves for the treatment of mitral regurgitation, by virtue of their key features as well as their effective crimping. These models will be fabricated and put to all the aforementioned tests before being taken to animal trials. PMID:24746106

  12. Multiwavelength CO2 differential-absorption lidar (DIAL) system designed for quantitative concentration measurement

    NASA Astrophysics Data System (ADS)

    Leonelli, Joseph; van der Laan, Jan; Holland, Peter; Fletcher, Leland; Warren, Russell

    1990-05-01

    A multiwavelength CO2 direct-detection DIAL system has been designed and developed to produce range-resolved vapor concentration contour plots of a 1 x 1 km grid at 20-m spatial resolution at 10-s intervals.

  13. A quantitative method for groundwater surveillance monitoring network design at the Hanford Site

    SciTech Connect

    Meyer, P.D.

    1993-12-01

    As part of the Environmental Surveillance Program at the Hanford Site, mandated by the US Department of Energy, hundreds of groundwater wells are sampled each year, with each sample typically analyzed for a variety of constituents. The groundwater sampling program must satisfy several broad objectives. These objectives include an integrated assessment of the condition of groundwater and the identification and quantification of existing, emerging, or potential groundwater problems. Several quantitative network design objectives are proposed and a mathematical optimization model is developed from these objectives. The model attempts to find minimum-cost network alternatives that maximize the amount of information generated by the network. Information is measured both by the rate of change with respect to time of the contaminant concentration and by the uncertainty in contaminant concentration. In an application to tritium monitoring at the Hanford Site, both information measures were derived from historical data using time series analysis.
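    One simple way to picture the minimum-cost/maximum-information trade-off is a greedy budgeted selection. This is an illustrative stand-in, not the report's formal optimization model; the well IDs, costs, and information scores below are all invented.

```python
# Greedy sketch: rank candidate wells by information per unit cost and pick
# them in that order until the sampling budget is exhausted. The information
# score could combine a concentration trend with an uncertainty measure.

def select_wells(wells, budget):
    """wells: list of (well id, sampling cost, information score) tuples."""
    chosen, spent = [], 0.0
    for name, cost, info in sorted(wells, key=lambda w: w[2] / w[1], reverse=True):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen, spent

wells = [  # hypothetical candidates
    ("W-1", 4.0, 9.0),
    ("W-2", 2.0, 6.0),
    ("W-3", 5.0, 5.0),
    ("W-4", 1.0, 2.0),
]
picked, cost = select_wells(wells, budget=7.0)
print(picked, cost)
```

    Greedy selection is only a heuristic; a formal model like the one described above can also handle constraints (e.g., minimum coverage per plume) that greedy ranking ignores.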

  14. Survey of the Quality of Experimental Design, Statistical Analysis and Reporting of Research Using Animals

    PubMed Central

    Kilkenny, Carol; Parsons, Nick; Kadyszewski, Ed; Festing, Michael F. W.; Cuthill, Innes C.; Fry, Derek; Hutton, Jane; Altman, Douglas G.

    2009-01-01

    For scientific, ethical and economic reasons, experiments involving animals should be appropriately designed, correctly analysed and transparently reported. This increases the scientific validity of the results, and maximises the knowledge gained from each experiment. A minimum amount of relevant information must be included in scientific publications to ensure that the methods and results of a study can be reviewed, analysed and repeated. Omitting essential information can raise scientific and ethical concerns. We report the findings of a systematic survey of reporting, experimental design and statistical analysis in published biomedical research using laboratory animals. Medline and EMBASE were searched for studies reporting research on live rats, mice and non-human primates carried out in UK and US publicly funded research establishments. Detailed information was collected from 271 publications, about the objective or hypothesis of the study, the number, sex, age and/or weight of animals used, and experimental and statistical methods. Only 59% of the studies stated the hypothesis or objective of the study and the number and characteristics of the animals used. Appropriate and efficient experimental design is a critical component of high-quality science. Most of the papers surveyed did not use randomisation (87%) or blinding (86%), to reduce bias in animal selection and outcome assessment. Only 70% of the publications that used statistical methods described their methods and presented the results with a measure of error or variability. This survey has identified a number of issues that need to be addressed in order to improve experimental design and reporting in publications describing research using animals. 
Scientific publication is a powerful and important source of information; the authors of scientific publications therefore have a responsibility to describe their methods and results comprehensively, accurately and transparently, and peer reviewers and journal editors share the responsibility to ensure that published studies fulfil these criteria. PMID:19956596

  15. Loop Shaping Control Design for a Supersonic Propulsion System Model Using Quantitative Feedback Theory (QFT) Specifications and Bounds

    NASA Technical Reports Server (NTRS)

    Connolly, Joseph W.; Kopasakis, George

    2010-01-01

    This paper covers the propulsion system component modeling and controls development of an integrated mixed-compression inlet and turbojet engine that will be used for an overall vehicle Aero-Propulso-Servo-Elastic (APSE) model. Using previously created nonlinear component-level propulsion system models, a linear integrated propulsion system model and loop shaping control design have been developed. The design includes both inlet normal shock position control and jet engine rotor speed control for a potential supersonic commercial transport. A preliminary investigation of the impacts of aero-elastic effects on the incoming flow field to the propulsion system is discussed; however, the focus here is on developing a methodology for the propulsion controls design that prevents unstart in the inlet and minimizes the thrust oscillation experienced by the vehicle. Quantitative Feedback Theory (QFT) specifications and bounds, and aspects of classical loop shaping, are used in the control design process. Model uncertainty is incorporated in the design to address possible error in the system identification mapping of the nonlinear component models into the integrated linear model.

  16. Indoor firing range air quality: results of a facility design survey.

    PubMed

    Schaeffer, D J; Deem, R A; Novak, E W

    1990-02-01

    A survey of 611 indoor firing ranges identified features of range design, operation, and maintenance that could affect air quality, a concern because of the lead composition of ammunition fired at the ranges. Features examined included the number of firing positions, location and type of air-handling equipment, maintenance practices, and other standard operating procedures (SOPs). Analysis of the data from the 339 valid responses showed that these ranges vary widely in design, construction, number of firing positions, and frequency of use. Most of these ranges were constructed years ago to the standards in force at the time. Consequently, they do not include many features specified in current standards. Findings from the survey suggest two possible options for ranges concerned about the lead exposure level: design upgrades and SOP changes. Retrofits are costly and design solutions must rely on existing criteria, many of which need verification to ensure their adequacy. Also, more research is needed to define the relationships between ventilation system design and lead exposures at the firing line. A lower cost, more expedient solution is to establish a program of prevention through changes in SOP, such as prohibition of unjacketed lead bullets or establishment of a regular program to monitor the proper operation of ventilation systems. Technology to enhance these preventive measures is being investigated. Possible products include devices for real-time monitoring of ambient air quality and personal lead dosage monitors. This study has underscored the need for further site studies to verify design criteria and to collect chemical and physical data that will help define the nature and extent of problems. PMID:2305677

  17. Sampling effects on the identification of roadkill hotspots: Implications for survey design.

    PubMed

    Santos, Sara M; Marques, J Tiago; Lourenço, André; Medinas, Denis; Barbosa, A Márcia; Beja, Pedro; Mira, António

    2015-10-01

    Although locating wildlife roadkill hotspots is essential to mitigate road impacts, the influence of study design on hotspot identification remains uncertain. We evaluated how sampling frequency affects the accuracy of hotspot identification, using a dataset of vertebrate roadkills (n = 4427) recorded over a year of daily surveys along 37 km of roads. "True" hotspots were identified using this baseline dataset, as the 500-m segments where the number of road-killed vertebrates exceeded the upper 95% confidence limit of the mean, assuming a Poisson distribution of road-kills per segment. "Estimated" hotspots were identified likewise, using datasets representing progressively lower sampling frequencies, which were produced by extracting data from the baseline dataset at appropriate time intervals (1-30 days). Overall, 24.3% of segments were "true" hotspots, concentrating 40.4% of roadkills. For different groups, "true" hotspots accounted from 6.8% (bats) to 29.7% (small birds) of road segments, concentrating from <40% (frogs and toads, snakes) to >60% (lizards, lagomorphs, carnivores) of roadkills. Spatial congruence between "true" and "estimated" hotspots declined rapidly with increasing time interval between surveys, due primarily to increasing false negatives (i.e., missing "true" hotspots). There were also false positives (i.e., wrong "estimated" hotspots), particularly at low sampling frequencies. Spatial accuracy decay with increasing time interval between surveys was higher for smaller-bodied (amphibians, reptiles, small birds, small mammals) than for larger-bodied species (birds of prey, hedgehogs, lagomorphs, carnivores). Results suggest that widely used surveys at weekly or longer intervals may produce poor estimates of roadkill hotspots, particularly for small-bodied species. Surveying daily or at two-day intervals may be required to achieve high accuracy in hotspot identification for multiple species. PMID:26232568
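    The hotspot rule described above, flagging segments whose counts exceed the upper 95% confidence limit of the mean under a Poisson assumption, can be sketched as follows. The one-sided normal approximation to the Poisson limit and the per-segment counts are assumptions of this sketch, not taken from the paper.

```python
import math

# Flag 500-m segments whose roadkill count exceeds the upper 95% confidence
# limit of the mean count, approximated here as mean + 1.645 * sqrt(mean)
# (a one-sided normal approximation to the Poisson limit).

def hotspots(segment_counts):
    mean = sum(segment_counts) / len(segment_counts)
    threshold = mean + 1.645 * math.sqrt(mean)
    return [i for i, c in enumerate(segment_counts) if c > threshold]

counts = [2, 1, 0, 3, 12, 2, 1, 9, 0, 2]  # hypothetical counts per 500-m segment
print(hotspots(counts))
```

    Applying the same rule to thinned datasets (e.g., keeping only every 7th survey day) and comparing the flagged segments against this baseline reproduces the false-negative/false-positive comparison the study makes.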

  18. Designing, Testing, and Validating an Attitudinal Survey on an Environmental Topic: A Groundwater Pollution Survey Instrument for Secondary School Students

    ERIC Educational Resources Information Center

    Lacosta-Gabari, Idoya; Fernandez-Manzanal, Rosario; Sanchez-Gonzalez, Dolores

    2009-01-01

    Research on the assessment of environmental attitudes has increased significantly in recent years, and the development of attitude scales for specific environmental problems has often been proposed. This paper describes the Groundwater Pollution Test (GPT), a 19-item survey instrument using a Likert-type scale. The survey has been used with…

  20. Campsite survey implications for managing designated campsites at Great Smoky Mountains National Park

    USGS Publications Warehouse

    Marion, J.L.; Leung, Y.-F.

    1998-01-01

    Backcountry campsites and shelters in Great Smoky Mountains National Park were surveyed in 1993 as part of a new impact monitoring program. A total of 395 campsites and shelters were located and assessed, including 309 legal campsites located at 84 designated campgrounds, 68 illegal campsites, and 18 shelters. Primary campsite management problems identified by the survey include: (1) campsite proliferation, (2) campsite expansion and excessive size, (3) excessive vegetation loss and soil exposure, (4) lack of visitor solitude at campsites, (5) excessive tree damage, and (6) illegal camping. A number of potential management options are recommended to address the identified campsite management problems. Many problems are linked to the ability of visitors to determine the location and number of individual campsites within each designated campground. A principal recommendation is that managers apply site-selection criteria to existing and potential new campsite locations to identify and designate campsites that will resist and constrain the areal extent of impacts and enhance visitor solitude. Educational solutions are also offered.

  1. Improving the design of acoustic and midwater trawl surveys through stratification, with an application to Lake Michigan prey fishes

    USGS Publications Warehouse

    Adams, J.V.; Argyle, R.L.; Fleischer, G.W.; Curtis, G.L.; Stickel, R.G.

    2006-01-01

    Reliable estimates of fish biomass are vital to the management of aquatic ecosystems and their associated fisheries. Acoustic and midwater trawl surveys are an efficient sampling method for estimating fish biomass in large bodies of water. To improve the precision of biomass estimates from combined acoustic and midwater trawl surveys, sampling effort should be optimally allocated within each stage of the survey design. Based on information collected during fish surveys, we developed an approach to improve the design of combined acoustic and midwater trawl surveys through stratification. Geographic strata for acoustic surveying and depth strata for midwater trawling were defined using neighbor-restricted cluster analysis, and the optimal allocation of sampling effort for each was then determined. As an example, we applied this survey stratification approach to data from lakewide acoustic and midwater trawl surveys of Lake Michigan prey fishes. Precision of biomass estimates from surveys with and without geographic stratification was compared through resampling. Use of geographic stratification with optimal sampling allocation reduced the variance of Lake Michigan acoustic biomass estimates by 77%. Stratification and optimal allocation at each stage of an acoustic and midwater trawl survey should serve to reduce the variance of the resulting biomass estimates.
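
The optimal allocation of effort across strata described above is classically done with Neyman allocation: sampling effort in each stratum proportional to the product of stratum size and stratum standard deviation. A minimal sketch follows; the Lake Michigan analysis derived its own cluster-based strata, and the numbers here are hypothetical.

```python
def neyman_allocation(stratum_sizes, stratum_sds, total_samples):
    """Neyman (optimal) allocation of a fixed number of sampling units
    across strata: effort in stratum h is proportional to N_h * S_h.
    Illustrative sketch; the study's actual allocation procedure is
    more involved."""
    weights = [n * s for n, s in zip(stratum_sizes, stratum_sds)]
    total_w = sum(weights)
    return [round(total_samples * w / total_w) for w in weights]

# Two hypothetical geographic strata of equal size; the more variable
# stratum receives three times the trawling effort.
print(neyman_allocation([100, 100], [1.0, 3.0], 40))
```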

  2. Cigarette pack design and adolescent smoking susceptibility: a cross-sectional survey

    PubMed Central

    Ford, Allison; MacKintosh, Anne Marie; Moodie, Crawford; Richardson, Sol; Hastings, Gerard

    2013-01-01

    Objectives To compare adolescents’ responses to three different styles of cigarette packaging: novelty (branded packs designed with a distinctive shape, opening style or bright colour), regular (branded pack with no special design features) and plain (brown pack with a standard shape and opening and all branding removed, aside from brand name). Design Cross-sectional in-home survey. Setting UK. Participants Random location quota sample of 1025 never smokers aged 11–16 years. Main outcome measures Susceptibility to smoking and composite measures of pack appraisal and pack receptivity derived from 11 survey items. Results Mean responses to the three pack types were negative for all survey items. However, ‘novelty’ packs were rated significantly less negatively than the ‘regular’ pack on most items, and the novelty and regular packs were rated less negatively than the ‘plain’ pack. For the novelty packs, logistic regressions, controlling for factors known to influence youth smoking, showed that susceptibility was associated with both positive appraisal and receptivity. For example, those receptive to the innovative Silk Cut Superslims pack were more than four times as likely to be susceptible to smoking as those not receptive to this pack (AOR=4.42, 95% CI 2.50 to 7.81, p<0.001). For the regular pack, an association was found between positive appraisal and susceptibility, but not between receptivity and susceptibility. There was no association with pack appraisal or receptivity for the plain pack. Conclusions Pack structure (shape and opening style) and colour are independently associated not just with appreciation of and receptivity to the pack, but also with susceptibility to smoking. In other words, those who think most highly of novelty cigarette packaging are also the ones who indicate that they are most likely to go on to smoke. Plain packaging, in contrast, was found to directly reduce the appeal of smoking to adolescents. PMID:24056481

  3. Decision making preferences in the medical encounter – a factorial survey design

    PubMed Central

    Müller-Engelmann, Meike; Krones, Tanja; Keller, Heidi; Donner-Banzhoff, Norbert

    2008-01-01

    Background It has not yet been systematically investigated in which kinds of clinical situations a consultation style based on shared decision making (SDM) is preferred by patients and physicians. We suggest the factorial survey design to address this problem. This method, which has so far rarely been used in health services research, allows relevant factors describing clinical situations to be varied systematically as variables in an experimental random design, and their importance to be investigated in large samples. Methods/Design To identify situational factors for the survey, we first performed a literature search, followed by a qualitative interview study with patients, physicians and health care experts. As a result, 7 factors (e.g. "Reason for consultation" and "Number of therapeutic options") with 2 to 3 levels each (e.g. "One therapeutic option" and "More than one therapeutic option") will be included in the study. For the survey, the factor levels will be randomly combined into short stories describing different treatment situations. A randomized sample of all possible short stories will be given to at least 300 subjects (100 GPs, 100 patients and 100 members of self-help groups), who will be asked to rate how the decision should be made. The main outcome measure is the preference for participation in the decision making process in the given clinical situation. Data analysis will estimate the effects of the factors on the rating and also examine differences between groups. Discussion The results will reveal the effects of situational variations on participation preferences. Thus, our findings will contribute to the understanding of normative values in the medical decision making process and will improve future implementation of SDM and decision aids. PMID:19091091
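
The vignette-construction step of a factorial survey — enumerating all factor-level combinations and drawing a random sample of them — can be sketched as below. The factor names and levels are hypothetical stand-ins for the 7 factors mentioned in the abstract.

```python
import itertools
import random

# Hypothetical factors modelled loosely on the study's design
# (the real survey uses 7 factors with 2-3 levels each).
FACTORS = {
    "reason_for_consultation": ["acute", "chronic"],
    "therapeutic_options": ["one option", "more than one option"],
    "severity": ["mild", "moderate", "severe"],
}

def sample_vignettes(factors, k, seed=0):
    """Enumerate the full factorial design (every combination of
    factor levels) and draw a random sample of k vignettes, as in a
    factorial survey."""
    levels = [[(name, lvl) for lvl in lvls] for name, lvls in factors.items()]
    universe = [dict(combo) for combo in itertools.product(*levels)]
    rng = random.Random(seed)
    return rng.sample(universe, k)
```

Each sampled dict describes one short-story scenario; in practice the factor levels would be slotted into narrative templates before being shown to respondents.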

  4. DESIGN AND APPLICATION OF A STRATIFIED UNEQUAL-PROBABILITY STREAM SURVEY IN THE MID-ATLANTIC COASTAL PLAIN

    EPA Science Inventory

    A stratified random sample with unequal probability selection within strata was used to design a multipurpose survey of headwater watersheds in the Mid-Atlantic Coastal Plain. Objectives for data from the survey include unbiased estimates of regional headwater watershed condition...

  5. The Proteome of Human Liver Peroxisomes: Identification of Five New Peroxisomal Constituents by a Label-Free Quantitative Proteomics Survey

    PubMed Central

    Ofman, Rob; Bunse, Christian; Pawlas, Magdalena; Hayen, Heiko; Eisenacher, Martin; Stephan, Christian; Meyer, Helmut E.; Waterham, Hans R.; Erdmann, Ralf; Wanders, Ronald J.; Warscheid, Bettina

    2013-01-01

    The peroxisome is a key organelle of low abundance that fulfils various functions essential for human cell metabolism. Severe genetic diseases in humans are caused by defects in peroxisome biogenesis or deficiencies in the function of single peroxisomal proteins. To improve our knowledge of this important cellular structure, we studied for the first time human liver peroxisomes by quantitative proteomics. Peroxisomes were isolated by differential and Nycodenz density gradient centrifugation. A label-free quantitative study of 314 proteins across the density gradient was accomplished using high resolution mass spectrometry. By pairing statistical data evaluation, cDNA cloning and in vivo colocalization studies, we report the association of five new proteins with human liver peroxisomes. Among these, isochorismatase domain containing 1 protein points to the existence of a new metabolic pathway and hydroxysteroid dehydrogenase like 2 protein is likely involved in the transport or β-oxidation of fatty acids in human peroxisomes. The detection of alcohol dehydrogenase 1A suggests the presence of an alternative alcohol-oxidizing system in hepatic peroxisomes. In addition, lactate dehydrogenase A and malate dehydrogenase 1 partially associate with human liver peroxisomes and enzyme activity profiles support the idea that NAD+ becomes regenerated during fatty acid β-oxidation by alternative shuttling processes in human peroxisomes involving lactate dehydrogenase and/or malate dehydrogenase. Taken together, our data represent a valuable resource for future studies of peroxisome biochemistry that will advance research of human peroxisomes in health and disease. PMID:23460848

  6. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data

    PubMed Central

    Lewis, Jesse S.; Gerber, Brian D.

    2014-01-01

    Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; explicitly recognizing that given a species occupies an area the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km2 of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10–120 cameras) and occasions (20–120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. 
For common species with low detection (i.e., bobcat and coyote) the most efficient sampling approach was to increase the number of occasions (survey days). However, for common species that are moderately detectable (i.e., cottontail rabbit and mule deer), occupancy could reliably be estimated with comparatively low numbers of cameras over a short sampling period. We provide general guidelines for reliably estimating occupancy across a range of terrestrial species (rare to common: ψ = 0.175–0.970, and low to moderate detectability: p = 0.003–0.200) using motion-activated cameras. Wildlife researchers/managers with limited knowledge of the relative abundance and likelihood of detection of a particular species can apply these guidelines regardless of location. We emphasize the importance of prior biological knowledge, defined objectives and detailed planning (e.g., simulating different study-design scenarios) for designing effective monitoring programs and research studies. PMID:25210658
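
The simulation approach the authors describe — generating detection histories for species with known occupancy (ψ) and detection probability (p), then assessing how survey effort affects the estimate — can be illustrated with a minimal sketch. This is not the authors' code; a naïve estimator is shown only to make the effect of imperfect detection visible.

```python
import random

def simulate_detection_histories(n_sites, n_occasions, psi, p, seed=0):
    """Simulate an occupancy study: each site (camera station) is
    occupied with probability psi; an occupied site is detected on each
    survey occasion with probability p. Returns one 0/1 detection
    history per site."""
    rng = random.Random(seed)
    histories = []
    for _ in range(n_sites):
        occupied = rng.random() < psi
        histories.append([1 if occupied and rng.random() < p else 0
                          for _ in range(n_occasions)])
    return histories

def naive_occupancy(histories):
    """Proportion of sites with at least one detection. This
    underestimates true occupancy whenever detection is imperfect,
    which is why occupancy models estimate psi and p jointly."""
    return sum(any(h) for h in histories) / len(histories)
```

With many occasions the naïve estimate approaches ψ; with few occasions it falls well below it, illustrating why adding survey days matters most for hard-to-detect species.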

  7. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data.

    PubMed

    Shannon, Graeme; Lewis, Jesse S; Gerber, Brian D

    2014-01-01

    Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; explicitly recognizing that given a species occupies an area the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km(2) of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10-120 cameras) and occasions (20-120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. 
For common species with low detection (i.e., bobcat and coyote) the most efficient sampling approach was to increase the number of occasions (survey days). However, for common species that are moderately detectable (i.e., cottontail rabbit and mule deer), occupancy could reliably be estimated with comparatively low numbers of cameras over a short sampling period. We provide general guidelines for reliably estimating occupancy across a range of terrestrial species (rare to common: ψ = 0.175-0.970, and low to moderate detectability: p = 0.003-0.200) using motion-activated cameras. Wildlife researchers/managers with limited knowledge of the relative abundance and likelihood of detection of a particular species can apply these guidelines regardless of location. We emphasize the importance of prior biological knowledge, defined objectives and detailed planning (e.g., simulating different study-design scenarios) for designing effective monitoring programs and research studies. PMID:25210658

  8. Requirements and concept design for large earth survey telescope for SEOS

    NASA Technical Reports Server (NTRS)

    Mailhot, P.; Bisbee, J.

    1975-01-01

    The efforts of a one-year program of Requirements Analysis and Conceptual Design for the Large Earth Survey Telescope for the Synchronous Earth Observatory Satellite are summarized. A 1.4-meter-aperture Cassegrain telescope with a 0.6 deg field of view is shown to satisfy the observational requirements for a wide range of earth resources and meteorological applications. The telescope provides imagery or thermal mapping in ten spectral bands at one time in a field-sharing grouping of linear detector arrays. Pushbroom scanning is accomplished by spacecraft slew.

  9. Questionnaire survey of customer satisfaction for product categories towards certification of ergonomic quality in design.

    PubMed

    Mochimaru, Masaaki; Takahashi, Miwako; Hatakenaka, Nobuko; Horiuchi, Hitoshi

    2012-01-01

    Customer satisfaction was surveyed for 6 product categories (consumer electronics, daily commodities, home equipment, information systems, cars, and health appliances) by questionnaires based on the Analytic Hierarchy Process. By analyzing the weights of the evaluation factors, the 6 product categories were reorganized into 4 categories, corresponding to 4 aspects of daily living formed by two axes: home living - mobility life and healthy life - active communication. It was found that consumers were attracted by actual user testing by public institutes for all product categories. Certification based on a design process standard established by authorities, such as EQUID, was the second-best attractor for consumers. PMID:22316844
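
Questionnaires based on the Analytic Hierarchy Process derive weights for evaluation factors from pairwise comparisons. A common approximation, sketched below with hypothetical numbers, uses row geometric means of the comparison matrix; the abstract does not specify the authors' exact computation.

```python
import math

def ahp_weights(pairwise):
    """Derive priority weights from an AHP pairwise-comparison matrix
    using the row geometric-mean approximation to the principal
    eigenvector. Entry [i][j] says how much more important factor i
    is than factor j (e.g. 3 = moderately more important)."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical example: factor A judged 3x as important as factor B.
print(ahp_weights([[1, 3], [1 / 3, 1]]))  # weights ~[0.75, 0.25]
```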

  10. Flow bioreactor design for quantitative measurements over endothelial cells using micro-particle image velocimetry

    NASA Astrophysics Data System (ADS)

    Leong, Chia Min; Voorhees, Abram; Nackman, Gary B.; Wei, Timothy

    2013-04-01

    Mechanotransduction in endothelial cells (ECs) is a highly complex process through which cells respond to changes in hemodynamic loading by generating biochemical signals involving gene and protein expression. To study the effects of mechanical loading on ECs in a controlled fashion, different in vitro devices have been designed to simulate or replicate various aspects of these physiological phenomena. This paper describes the design, use, and validation of a flow chamber which allows for spatially and temporally resolved micro-particle image velocimetry measurements of endothelial surface topography and stresses over living ECs immersed in pulsatile flow. This flow chamber also allows the study of co-cultures (i.e., ECs and smooth muscle cells) and the effect of different substrates (i.e., coverslip and/or polyethylene terephthalate (PET) membrane) on cellular response. In this report, the results of steady and pulsatile flow on fixed endothelial cells seeded on PET membrane and coverslip, respectively, are presented. Surface topography of ECs is computed from multiple two-dimensional flow measurements. The distributions of shear stress and wall pressure on each individual cell are also determined, and the importance of both types of stress in cell remodeling is highlighted.

  11. A quantitative risk analysis tool for design/simulation of wastewater treatment plants.

    PubMed

    Bixio, D; Parmentier, G; Rousseau, D; Verdonck, F; Meirlaen, J; Vanrolleghem, P A; Thoeye, C

    2002-01-01

    Uncertainty is a central concept in the decision-making process, especially when dealing with biological systems subject to large natural variations. In the design of activated sludge systems, a conventional approach in dealing with uncertainty is implicitly translating it into above-normal safety factors, which in some cases may even increase the capital investments by an order of magnitude. To obviate this problem, an alternative design approach explicitly incorporating uncertainty is herein proposed. A probabilistic Monte Carlo engine is coupled to deterministic wastewater treatment plant (WWTP) models. The paper provides a description of the approach and a demonstration of the general adequacy of the method. The procedure is examined in an upgrade of a conventional WWTP towards stricter effluent standards on nutrients. The results suggest that the procedure can support the decision-making process under uncertainty conditions and that it can enhance the likelihood of meeting effluent standards without entailing above-normal capital investments. The analysis led to reducing the capital investment by 43%, producing savings of more than 1.2 million euro. PMID:12361025
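
The idea of coupling a probabilistic Monte Carlo engine to a deterministic plant model — propagating uncertain inputs through the model and reading off the probability of meeting an effluent standard — can be sketched with a deliberately toy model. The load distribution, removal model, and standard below are all hypothetical, not taken from the paper.

```python
import random

def mc_compliance(design_capacity, n_runs=10000, seed=0):
    """Monte Carlo sketch of probabilistic design: sample uncertain
    influent loads, run a (trivially simplified) treatment model, and
    estimate the probability that the effluent standard is met.
    The lognormal load, subtraction-style removal model, and the 0.1
    standard are hypothetical placeholders for a real WWTP model."""
    rng = random.Random(seed)
    meets = 0
    for _ in range(n_runs):
        load = rng.lognormvariate(0.0, 0.3)          # uncertain influent load
        effluent = max(0.0, load - design_capacity)  # toy removal model
        if effluent <= 0.1:                          # hypothetical standard
            meets += 1
    return meets / n_runs
```

Running this for a range of capacities traces out compliance probability versus capital investment, which is the trade-off the decision-maker evaluates instead of a single above-normal safety factor.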

  12. HIV testing during the Canadian immigration medical examination: a national survey of designated medical practitioners.

    PubMed

    Tran, Jennifer M; Li, Alan; Owino, Maureen; English, Ken; Mascarenhas, Lyndon; Tan, Darrell H S

    2014-01-01

    HIV testing is mandatory for individuals wishing to immigrate to Canada. Since the Designated Medical Practitioners (DMPs) who perform these tests may have varying experience in HIV and time constraints in their clinical practices, there may be variability in the quality of pre- and posttest counseling provided. We surveyed DMPs regarding HIV testing, counseling, and immigration inadmissibility. A 16-item survey was mailed to all DMPs across Canada (N = 203). The survey inquired about DMP characteristics, knowledge of HIV, attitudes and practices regarding inadmissibility and counseling, and interest in continuing medical education. There were a total of 83 respondents (41%). Participants frequently rated their knowledge of HIV diagnostics, cultural competency, and HIV/AIDS service organizations as "fair" (40%, 43%, and 44%, respectively). About 25%, 46%, and 11% of the respondents agreed/strongly agreed with the statements "HIV infected individuals pose a danger to public health and safety," "HIV-positive immigrants cause excessive demand on the healthcare system," and "HIV seropositivity is a reasonable ground for denial into Canada," respectively. Language was cited as a barrier to counseling, which focused on transmission risks (46% discussed this as "always" or "often") more than coping and social support (37%). There was a high level of interest (47%) in continuing medical education in this area. There are areas for improvement regarding DMPs' knowledge, attitudes, and practices about HIV infection, counseling, and immigration criteria. Continuing medical education and support for DMPs to facilitate practice changes could benefit newcomers who test positive through the immigration process. PMID:25029636

  13. A survey of two-stage focusing systems for nanobeam design

    NASA Astrophysics Data System (ADS)

    Merchant, M. J.; Grime, G. W.; Kirkby, K. J.; Webb, R.

    2007-07-01

    Since the construction of the first ion microprobe at Harwell in the early 1970s, there has been a steady improvement in the resolution of ion microprobes. However, in recent years the rate of improvement has slowed dramatically, with few systems able routinely to achieve resolutions below 1 μm. There are many reasons for this lack of progress, relating both to engineering at the nanometer scale and to fundamental physics. One crucial factor in achieving sub-micrometer resolution is the beam optical performance of the focusing system. This requires a high demagnification to permit the use of larger object apertures, giving lower slit scattering, but this usually results in a correspondingly large aberration. We are investigating the use of two-stage lens systems with an intermediate focus. Such systems are capable of far greater demagnification than single-stage systems, but the challenge is to find a two-stage system with an acceptable ratio between demagnification and aberration. This paper presents the preliminary results of a systematic survey of two-stage lenses for nanobeam design. The scope of the survey is bounded by a number of practical limitations for nanobeam design. The survey encompasses systems of up to 8 quadrupole lenses and 4 independent power supplies, arranged as two groups of up to four lenses constrained to form an intermediate image. The parameter space for this survey is vast; even restricting the quadrupole lengths to those commercially available and to the 9 m beam length available in our laboratory, several million system geometries have been considered. A matrix-based beam optics software package has been developed which surveys the parameter space to determine the optimum value of a figure of merit based on the ratio of demagnification to spherical aberration. This uses the analytical approximations for spherical aberration in quadrupole lenses derived by Dymnikov et al. [A.D. Dymnikov, T.Ya. Fishkova, S.Ya. Yavor, Nucl. Instr. and Meth. 37 (1965) 268]. The performance of selected systems with good figures of merit has been further investigated using numerical raytracing software, and the results are presented.

  14. Statistical methods for efficient design of community surveys of response to noise: Random coefficients regression models

    NASA Technical Reports Server (NTRS)

    Tomberlin, T. J.

    1985-01-01

    Research studies of residents' responses to noise consist of interviews with samples of individuals who are drawn from a number of different compact study areas. The statistical techniques developed here provide a basis for such sample design decisions. These techniques are suitable for a wide range of sample survey applications. A sample may consist of a random sample of residents selected from a sample of compact study areas, or, in a more complex design, of a sample of residents selected from a sample of larger areas (e.g., cities). The techniques may be applied to estimates of the effects on annoyance of noise level, numbers of noise events, the time-of-day of the events, ambient noise levels, or other factors. Methods are provided for determining, in advance, how accurately these effects can be estimated for different sample sizes and study designs. Using a simple cost function, they also provide for optimum allocation of the sample across the stages of the design for estimating these effects. These techniques are developed via a regression model in which the regression coefficients are assumed to be random, with components of variance associated with the various stages of a multi-stage sample design.
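
The random coefficients setup — a regression slope that varies across compact study areas, with variance components at each sampling stage — can be illustrated by simulating such a two-stage sample. All parameter values below are hypothetical.

```python
import random

def simulate_two_stage(n_areas, n_per_area, beta=0.5, sd_slope=0.2,
                       sd_noise=0.1, seed=0):
    """Simulate the random-coefficients model sketched above: each
    study area draws its own annoyance-vs-noise slope around a common
    mean beta (between-area variance component), then residents'
    responses are generated from that slope plus individual-level
    noise (within-area component). Returns (noise levels, annoyance
    scores, area ids)."""
    rng = random.Random(seed)
    xs, ys, areas = [], [], []
    for a in range(n_areas):
        slope = rng.gauss(beta, sd_slope)   # area-specific coefficient
        for _ in range(n_per_area):
            x = rng.uniform(50, 80)         # noise exposure, dB
            y = slope * x + rng.gauss(0, sd_noise)
            xs.append(x)
            ys.append(y)
            areas.append(a)
    return xs, ys, areas
```

Varying `n_areas` against `n_per_area` at fixed total cost is exactly the allocation question the paper's variance formulas answer in advance.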

  15. The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    NASA Technical Reports Server (NTRS)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Coil, Alison L; Guhathakurta, Puraga; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Wilmer, C. N. A.; Yan, Renbin; Harker, Justin J.; Kassin, Susan A.; Konidaris, N. P.; Lai, Kamson; Madgwick, Darren S.; Noeske, K. G.; Wirth, Gregory D.; Kirby, Evan N.; Lotz, Jennifer M.

    2013-01-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z approx. 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude MB = -20 at z approx. 1 via approx.90 nights of observation on the Keck telescope. The survey covers an area of 2.8 Sq. deg divided into four separate fields observed to a limiting apparent magnitude of R(sub AB) = 24.1. Objects with z approx. < 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted approx. 2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z approx. 1.45, where the [O ii] 3727 Ang. doublet lies in the infrared. The DEIMOS 1200 line mm(exp -1) grating used for the survey delivers high spectral resolution (R approx. 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. 
Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. Redshift errors and catastrophic failure rates are assessed through more than 2000 objects with duplicate observations. Sky subtraction is essentially photon-limited even under bright OH sky lines; we describe the strategies that permitted this, based on high image stability, accurate wavelength solutions, and powerful B-spline modeling methods. We also investigate the impact of targets that appear to be single objects in ground-based targeting imaging but prove to be composite in Hubble Space Telescope data; they constitute several percent of targets at z approx. 1, approaching approx. 5%-10% at z > 1.5. Summary data are given that demonstrate the superiority of DEEP2 over other deep high-precision redshift surveys at z approx. 1 in terms of redshift accuracy, sample number density, and amount of spectral information. We also provide an overview of the scientific highlights of the DEEP2 survey thus far.

  16. THE DEEP2 GALAXY REDSHIFT SURVEY: DESIGN, OBSERVATIONS, DATA REDUCTION, AND REDSHIFTS

    SciTech Connect

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Harker, Justin J.; Lai, Kamson; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan, Renbin; Kassin, Susan A.; Konidaris, N. P.; and others

    2013-09-15

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z {approx} 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M{sub B} = -20 at z {approx} 1 via {approx}90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg{sup 2} divided into four separate fields observed to a limiting apparent magnitude of R{sub AB} = 24.1. Objects with z {approx}< 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted {approx}2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z {approx} 1.45, where the [O II] 3727 A doublet lies in the infrared. The DEIMOS 1200 line mm{sup -1} grating used for the survey delivers high spectral resolution (R {approx} 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. 
Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. Redshift errors and catastrophic failure rates are assessed through more than 2000 objects with duplicate observations. Sky subtraction is essentially photon-limited even under bright OH sky lines; we describe the strategies that permitted this, based on high image stability, accurate wavelength solutions, and powerful B-spline modeling methods. We also investigate the impact of targets that appear to be single objects in ground-based targeting imaging but prove to be composite in Hubble Space Telescope data; they constitute several percent of targets at z ~ 1, approaching ~5%-10% at z > 1.5. Summary data are given that demonstrate the superiority of DEEP2 over other deep high-precision redshift surveys at z ~ 1 in terms of redshift accuracy, sample number density, and amount of spectral information. We also provide an overview of the scientific highlights of the DEEP2 survey thus far.

  17. The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    NASA Astrophysics Data System (ADS)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Coil, Alison L.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan, Renbin; Harker, Justin J.; Kassin, Susan A.; Konidaris, N. P.; Lai, Kamson; Madgwick, Darren S.; Noeske, K. G.; Wirth, Gregory D.; Connolly, A. J.; Kaiser, N.; Kirby, Evan N.; Lemaux, Brian C.; Lin, Lihwai; Lotz, Jennifer M.; Luppino, G. A.; Marinoni, C.; Matthews, Daniel J.; Metevier, Anne; Schiavon, Ricardo P.

    2013-09-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ~ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude MB = -20 at z ~ 1 via ~90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg2 divided into four separate fields observed to a limiting apparent magnitude of R AB = 24.1. Objects with z <~ 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted ~2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ~ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm-1 grating used for the survey delivers high spectral resolution (R ~ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. 
Redshift errors and catastrophic failure rates are assessed through more than 2000 objects with duplicate observations. Sky subtraction is essentially photon-limited even under bright OH sky lines; we describe the strategies that permitted this, based on high image stability, accurate wavelength solutions, and powerful B-spline modeling methods. We also investigate the impact of targets that appear to be single objects in ground-based targeting imaging but prove to be composite in Hubble Space Telescope data; they constitute several percent of targets at z ~ 1, approaching ~5%-10% at z > 1.5. Summary data are given that demonstrate the superiority of DEEP2 over other deep high-precision redshift surveys at z ~ 1 in terms of redshift accuracy, sample number density, and amount of spectral information. We also provide an overview of the scientific highlights of the DEEP2 survey thus far. Based on observations taken at the W. M. Keck Observatory, which is operated jointly by the University of California and the California Institute of Technology, and on observations made with the NASA/ESA Hubble Space Telescope, obtained from the data archives at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555, and from the Canadian Astronomy Data Centre.
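
The z ~ 1.45 limit quoted above follows from simple redshift arithmetic for the [O II] doublet: the line leaves the optical once its observed wavelength passes the red end of the spectral coverage. A minimal sketch; the ~9100 Å red cutoff below is an assumed value for illustration, not a number from the paper:

```python
# Redshift arithmetic for the [O II] 3727 A doublet. The red cutoff of the
# usable spectral coverage (~9100 A) is an assumed illustrative value.

OII_REST = 3727.0    # rest wavelength of the [O II] doublet, Angstroms
RED_CUTOFF = 9100.0  # assumed red limit of the spectral coverage, Angstroms

def observed_wavelength(z, rest=OII_REST):
    """Observed wavelength of a spectral line at redshift z."""
    return rest * (1.0 + z)

def max_oii_redshift(cutoff=RED_CUTOFF, rest=OII_REST):
    """Highest redshift at which [O II] still falls within the coverage."""
    return cutoff / rest - 1.0

print(round(max_oii_redshift(), 2))  # -> 1.44, close to the z ~ 1.45 quoted above
```

Beyond this redshift the survey must rely on other features, which is why blue objects past z ~ 1.45 dominate the redshift failures.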

  18. Quantitative evaluation of a thrust vector controlled transport at the conceptual design phase

    NASA Astrophysics Data System (ADS)

    Ricketts, Vincent Patrick

    The impetus to innovate, to push the bounds and break the molds of evolutionary design trends, often comes from competition but sometimes requires catalytic political legislation. For this research endeavor, the "catalyzing legislation" comes in response to the rising cost of fossil fuels and NASA's request that aircraft manufacturers demonstrate a 60% reduction in aircraft fuel consumption within 30 years. Meeting such targets requires that novel technologies be considered. One such technology is thrust vector control (TVC). The beneficial characteristic of thrust vector control applied to the traditional tail-aft configuration (TAC) commercial transport is that it retains the operational advantages of this highly evolved aircraft type, such as cabin evacuation, ground operation, safety, and certification. This study explores whether the TVC transport concept offers improved flight performance by synergistically reducing the traditional empennage size, resulting in reduced weight and drag and therefore reduced fuel consumption. In particular, it examines whether TVC technology combined with the reduced-empennage methodology enables the TAC aircraft to evolve synergistically while complying with current safety and certification regulations. This research utilizes the multi-disciplinary parametric sizing software AVD Sizing, developed by the Aerospace Vehicle Design (AVD) Laboratory. The sizing software visualizes the total system solution space via parametric trades and can determine whether TVC technology enables such evolution. The study indicates that the TVC plus reduced-empennage approach shows marked improvements in performance and cost.

  19. Design and Evaluation of Digital Learning Material to Support Acquisition of Quantitative Problem-Solving Skills Within Food Chemistry

    NASA Astrophysics Data System (ADS)

    Diederen, Julia; Gruppen, Harry; Hartog, Rob; Voragen, Alphons G. J.

    2005-12-01

    One of the modules in the course Food Chemistry at Wageningen University (Wageningen, The Netherlands) focuses on quantitative problem-solving skills related to chemical reactions. The intended learning outcomes of this module are, firstly, to be able to translate practical food chemistry related problems into mathematical equations and to solve them and, secondly, to have a quantitative understanding of chemical reactions in food. Until 3 years ago the learning situation for this module was inefficient for both teachers and students: a staff/student ratio of 1/25 proved insufficient, the level of student frustration was high, and many students could not finish the tasks within the scheduled time. To make this situation more efficient for both students and teachers and to lower the level of frustration, digital learning material was designed. The main characteristic of this learning material is that it provides just-in-time information, such as feedback, hints, and links to background information. The material was evaluated in three case studies in a normal educational setting (n = 22, n = 31, n = 33). The results show that student frustration is now low, time in class is used efficiently, and the staff/student ratio of 1/25 is indeed sufficient. A staff/student ratio of around 1/40 is now regarded as realistic.

  20. Schools and Staffing Survey, 1990-91: Sample Design and Estimation. Technical Report.

    ERIC Educational Resources Information Center

    Kaufman, Steven; Huang, Hertz

    The Schools and Staffing Survey (SASS) represents the union of three surveys by the National Center for Education Statistics (NCES): the Teacher Demand and Shortage Survey, the School and School Administrator Surveys, and the Teacher Survey. The SASS measures critical aspects of teaching supply and demand, the composition of the teacher and…

  1. A successful 3D seismic survey in the "no-data zone," offshore Mississippi delta: Survey design and refraction static correction processing

    SciTech Connect

    Carvill, C.; Faris, N.; Chambers, R.

    1996-12-31

    This is a success story of survey design and refraction static correction processing of a large 3D seismic survey in the South Pass area of the Mississippi delta. In this transition zone, subaqueous mudflow gullies and lobes of the delta, in various states of consolidation and gas saturation, are strong absorbers of seismic energy. Seismic waves penetrating the mud are severely restricted in bandwidth and variously delayed by changes in mud velocity and thickness. Using a delay-time refraction static correction method, the authors find that the compensations for these various delays, i.e., the static corrections, commonly vary by 150 ms over short distances. Application of the static corrections markedly improves the seismic stack volume. This paper shows that intelligent survey design and delay-time refraction static correction processing economically eliminated the historic no-data status of this area.
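
The delay-time idea behind refraction statics can be illustrated with toy numbers. This is a generic two-layer sketch, not the authors' processing flow; all velocities and thicknesses below are assumed for illustration:

```python
import math

# Generic two-layer delay-time sketch (illustrative numbers only).
# For a refractor of velocity v2 beneath a slow surface layer of velocity v1
# and thickness h, the layer contributes a delay time h*cos(theta_c)/v1 at a
# station, where sin(theta_c) = v1/v2 is the critical angle. The static
# correction removes that delay so traces align as if recorded on the refractor.

def delay_time(h, v1, v2):
    """Delay time (s) contributed by a surface layer of thickness h (m)."""
    theta_c = math.asin(v1 / v2)        # critical angle for refraction
    return h * math.cos(theta_c) / v1

def static_correction_ms(h, v1, v2):
    """Time shift (ms) that strips the layer delay from a trace."""
    return -1000.0 * delay_time(h, v1, v2)

# Example: a 50 m mud layer (v1 = 1600 m/s) over a v2 = 2000 m/s refractor.
print(round(static_correction_ms(50.0, 1600.0, 2000.0), 2))  # about -18.75 ms
```

Lateral changes in mud thickness and velocity change this delay station by station, which is how the large short-distance static variations described above arise.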

  2. Biochip array technology immunoassay performance and quantitative confirmation of designer piperazines for urine workplace drug testing.

    PubMed

    Castaneto, Marisol S; Barnes, Allan J; Concheiro, Marta; Klette, Kevin L; Martin, Thomas A; Huestis, Marilyn A

    2015-06-01

    Designer piperazines are emerging novel psychoactive substances (NPS) with few high-throughput screening methods for their identification. We evaluated a biochip array technology (BAT) immunoassay for phenylpiperazines (PNP) and benzylpiperazines (BZP) and analyzed 20,017 randomly collected urine workplace specimens. Immunoassay performance at recommended cutoffs was evaluated for PNPI (5 μg/L), PNPII (7.5 μg/L), and BZP (5 μg/L) antibodies. Eight hundred forty presumptive positive and 206 randomly selected presumptive negative specimens were confirmed by liquid chromatography high-resolution mass spectrometry (LC-HRMS). Assay limits of detection for PNPI, PNPII, and BZP were 2.9, 6.3, and 2.1 μg/L, respectively. Calibration curves were linear (R² > 0.99) with upper limits of 42 μg/L for PNPI/PNPII and 100 μg/L for BZP. Quality control samples demonstrated imprecision <19.3 %CV and accuracies 86.0-94.5 % of target. There were no interferences from 106 non-piperazine substances. Seventy-eight of 840 presumptive positive specimens (9.3 %) were LC-HRMS positive, with 72 positive for 1-(3-chlorophenyl)piperazine (mCPP), a designer piperazine and antidepressant trazodone metabolite. Of 206 presumptive negative specimens, one confirmed positive for mCPP (3.3 μg/L) and one for BZP (3.6 μg/L). BAT specificity (21.1 to 91.4 %) and efficiency (27.0 to 91.6 %) increased, and sensitivity slightly decreased (97.5 to 93.8 %), with optimized cutoffs of 25 μg/L PNPI, 42 μg/L PNPII, and 100 μg/L BZP. A high-throughput screening method is needed to identify piperazine NPS. We evaluated performance of the Randox BAT immunoassay to identify urinary piperazines and documented improved performance when antibody cutoffs were raised. In addition, in randomized workplace urine specimens, all but two positive specimens contained mCPP and/or trazodone, most likely from legitimate medical prescriptions. 
Graphical Abstract Biochip array technology (BAT) immunoassay for designer piperazines detection in urine. In chemiluminescent immunoassay, the labeled-drug (antigen) competes with the drug in the urine. In the absence of drug, the labeled-drug binds to the antibody releasing an enzyme (horseradish peroxidase) to react with the substrate and producing chemiluminescence. The higher the drug concentration in urine, the weaker the chemiluminescent signal is produced. All presumptive positive specimens and randomly selected presumptive negative specimens were analyzed and confirmed by a liquid chromatography high-resolution mass spectrometry with limit of quantification of 2.5 or 5 μg/L. PMID:25903022
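
The sensitivity, specificity, and efficiency figures quoted above come from comparing screening calls against LC-HRMS confirmation. The sketch below shows the standard definitions; the confusion-matrix counts are hypothetical, not the study's data:

```python
# Screening-vs-confirmation performance metrics. Counts are hypothetical
# placeholders, not the confusion matrix from the study above.

def screen_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, and efficiency of a screen judged against
    a confirmatory method (e.g., LC-HRMS)."""
    sensitivity = tp / (tp + fn)                   # confirmed positives caught
    specificity = tn / (tn + fp)                   # confirmed negatives cleared
    efficiency = (tp + tn) / (tp + fp + tn + fn)   # overall agreement
    return sensitivity, specificity, efficiency

sens, spec, eff = screen_metrics(tp=75, fp=10, tn=190, fn=5)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} efficiency={eff:.1%}")
```

Raising a cutoff moves borderline screens from positive to negative, which is why specificity and efficiency rise at some cost in sensitivity, as reported above.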

  3. Dealing with Trade-Offs in Destructive Sampling Designs for Occupancy Surveys

    PubMed Central

    Canessa, Stefano; Heard, Geoffrey W.; Robertson, Peter; Sluiter, Ian R. K.

    2015-01-01

    Occupancy surveys should be designed to minimise false absences. This is commonly achieved by increasing replication or increasing the efficiency of surveys. In the case of destructive sampling designs, in which searches of individual microhabitats represent the repeat surveys, minimising false absences leads to an inherent trade-off. Surveyors can sample more low quality microhabitats, bearing the resultant financial costs and producing wider-spread impacts, or they can target high quality microhabitats where the focal species is more likely to be found and risk more severe impacts on local habitat quality. We show how this trade-off can be solved with a decision-theoretic approach, using the Millewa Skink Hemiergis millewae from southern Australia as a case study. Hemiergis millewae is an endangered reptile that is best detected using destructive sampling of grass hummocks. Within sites that were known to be occupied by H. millewae, logistic regression modelling revealed that lizards were more frequently detected in large hummocks. If this model is an accurate representation of the detection process, searching large hummocks is more efficient and requires less replication, but this strategy also entails destruction of the best microhabitats for the species. We developed an optimisation tool to calculate the minimum combination of the number and size of hummocks to search to achieve a given cumulative probability of detecting the species at a site, incorporating weights to reflect the sensitivity of the results to a surveyor’s priorities. The optimisation showed that placing high weight on minimising volume necessitates impractical replication, whereas placing high weight on minimising replication requires searching very large hummocks which are less common and may be vital for H. millewae. While destructive sampling methods are sometimes necessary, surveyors must be conscious of the ecological impacts of these methods. 
This study provides a simple tool for identifying sampling strategies that minimise those impacts. PMID:25760868
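
The arithmetic behind the trade-off can be sketched briefly: a logistic model gives a per-hummock detection probability that rises with hummock size, and the cumulative detection probability over n searches is 1 - (1 - p)^n. The coefficients below are illustrative, not the study's fitted values:

```python
import math

# Sketch of the replication-vs-hummock-size trade-off. The logistic
# coefficients b0, b1 are assumed for illustration, not fitted values.

def detection_prob(volume, b0=-2.0, b1=0.05):
    """Per-hummock detection probability from a logistic model of volume."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * volume)))

def hummocks_needed(volume, target=0.95, b0=-2.0, b1=0.05):
    """Minimum number of hummocks of a given volume to reach the target
    cumulative detection probability at an occupied site."""
    p = detection_prob(volume, b0, b1)
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))

# Larger hummocks need fewer searches -- the crux of the trade-off:
for v in (20, 40, 80):
    print(v, hummocks_needed(v))  # -> 20 10, 40 5, 80 2
```

A weighted optimisation like the one described above then balances total searched volume against the number of (scarce, high-value) large hummocks destroyed.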

  5. A Versatile Method to Design Stem-Loop Primer-Based Quantitative PCR Assays for Detecting Small Regulatory RNA Molecules

    PubMed Central

    Czimmerer, Zsolt; Hulvely, Julianna; Simandi, Zoltan; Varallyay, Eva; Havelda, Zoltan; Szabo, Erzsebet; Varga, Attila; Dezso, Balazs; Balogh, Maria; Horvath, Attila; Domokos, Balint; Torok, Zsolt; Nagy, Laszlo; Balint, Balint L.

    2013-01-01

    Short regulatory RNAs have been identified as key regulators of gene expression in eukaryotes. They are involved in the regulation of both physiological and pathological processes such as embryonic development, immunoregulation, and cancer. One of their relevant characteristics is their high stability, which makes them excellent candidates for use as biomarkers. Their number is constantly increasing as next-generation sequencing methods reveal more and more details of their synthesis. These novel findings call for new detection methods for individual short regulatory RNAs, in order to confirm the primary data and to characterize newly identified subtypes under different biological conditions. We have developed a flexible method to design RT-qPCR assays that are very sensitive and robust. The newly designed assays were tested extensively in samples from plant, mouse, and even human formalin-fixed paraffin-embedded tissues. Moreover, we have shown that these assays are able to quantify endogenously generated shRNA molecules. The assay design method is freely available for anyone who wishes to use a robust and flexible system for the quantitative analysis of mature regulatory RNAs. PMID:23383094
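
Quantification in RT-qPCR assays like those above is commonly reported as a fold change computed from cycle-threshold (Ct) values via the 2^-ΔΔCt method. The sketch below is that generic qPCR arithmetic, not the paper's own pipeline; the Ct values are hypothetical:

```python
# Generic 2^-ddCt (Livak) relative-quantification arithmetic.
# All Ct values below are hypothetical illustration data.

def fold_change(ct_target_sample, ct_ref_sample, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression of a target RNA by the 2^-ddCt method."""
    dct_sample = ct_target_sample - ct_ref_sample  # normalize to reference RNA
    dct_ctrl = ct_target_ctrl - ct_ref_ctrl
    ddct = dct_sample - dct_ctrl                   # sample relative to control
    return 2.0 ** (-ddct)

# Target RNA crossing threshold two cycles earlier in the treated sample
# (relative to control) implies roughly 4-fold up-regulation:
print(fold_change(24.0, 18.0, 26.0, 18.0))  # -> 4.0
```

This assumes near-100% amplification efficiency, which is why the paper's emphasis on assay sensitivity and robustness matters for quantitative claims.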

  6. Use of physiological constraints to identify quantitative design principles for gene expression in yeast adaptation to heat shock

    PubMed Central

    Vilaprinyo, Ester; Alves, Rui; Sorribas, Albert

    2006-01-01

    Background Understanding the relationship between gene expression changes, enzyme activity shifts, and the corresponding physiological adaptive response of organisms to environmental cues is crucial in explaining how cells cope with stress. For example, adaptation of yeast to heat shock involves a characteristic profile of changes to the expression levels of genes coding for enzymes of the glycolytic pathway and some of its branches. The experimental determination of changes in gene expression profiles provides a descriptive picture of the adaptive response to stress. However, it does not explain why a particular profile is selected for any given response. Results We used mathematical models and analysis of in silico gene expression profiles (GEPs) to understand how changes in gene expression correlate to an efficient response of yeast cells to heat shock. An exhaustive set of GEPs, matched with the corresponding set of enzyme activities, was simulated and analyzed. The effectiveness of each profile in the response to heat shock was evaluated according to relevant physiological and functional criteria. The small subset of GEPs that lead to effective physiological responses after heat shock was identified as the result of the tuning of several evolutionary criteria. The experimentally observed transcriptional changes in response to heat shock belong to this set and can be explained by quantitative design principles at the physiological level that ultimately constrain changes in gene expression. Conclusion Our theoretical approach suggests a method for understanding the combined effect of changes in the expression of multiple genes on the activity of metabolic pathways, and consequently on the adaptation of cellular metabolism to heat shock. This method identifies quantitative design principles that facilitate understanding the response of the cell to stress. PMID:16584550

  7. A Survey to Examine Teachers' Perceptions of Design Dispositions, Lesson Design Practices, and Their Relationships with Technological Pedagogical Content Knowledge (TPACK)

    ERIC Educational Resources Information Center

    Koh, Joyce Hwee Ling; Chai, Ching Sing; Hong, Huang-Yao; Tsai, Chin-Chung

    2015-01-01

    This study investigates 201 Singaporean teachers' perceptions of their technological pedagogical content knowledge (TPACK), lesson design practices, and design dispositions through a survey instrument. Investigation of these constructs reveal important variables influencing teachers' perceptions of TPACK which have not yet been explored. The…

  9. Automated Analysis of the Digitized Second Palomar Sky Survey: System Design, Implementation, and Initial Results

    NASA Astrophysics Data System (ADS)

    Weir, Nicholas

    1995-01-01

    We describe the design, implementation, and initial scientific results of a system for analyzing the Digitized Second Palomar Observatory Sky Survey (DPOSS). The system (SKICAT) facilitates and largely automates the pipeline processing of DPOSS from raw pixel data into calibrated, classified object catalog form. A fundamental constraint limiting the scientific usefulness of optical imaging surveys is the level at which objects may be reliably distinguished as stars, galaxies, or artifacts. The classifier implemented within SKICAT was created using a new machine learning technology, whereby an algorithm determines a near-optimal set of classification rules based upon training examples. Using this approach, we were able to construct a classifier which distinguishes objects to the same level of accuracy as in previous surveys using comparable plate material, but nearly one magnitude fainter (or an equivalent B_J ~ 21.0). Our first analysis of DPOSS using SKICAT is of an overlapping set of four survey fields near the North Galactic Pole, in both the J and F passbands. Through detailed simulations of a subset of these data, we were able to analyze systematic aspects of our detection and measurement procedures, as well as optimize them. We discuss how we calibrate the plate magnitudes to the Gunn-Thuan g and r photometric system using CCD sequences obtained in a program devoted expressly to calibrating DPOSS. Our technique results in an estimated plate-to-plate zero point standard error of under 0.10 mag in g and below 0.05 mag in r for the J and F plates, respectively. Using the catalogs derived from these fields, we compare our differential galaxy counts in g and r with those from recent Schmidt plate surveys as well as predictions from evolutionary and non-evolutionary (NE) galaxy models. 
We find generally good agreement between our counts and recent NE and mild evolutionary models calibrated to consistently fit bright and faint galaxy counts, colors, and redshift distributions. The consistency of our results with these predictions provides additional support to the view that very recent (z < 0.1) or exotic galaxy evolution, or some non-standard forms of cosmology, may not be necessary to reconcile these diverse observations with theory.
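
SKICAT's classifier was induced from labelled training examples by a rule-learning algorithm. The pure-Python "decision stump" below is a deliberately tiny stand-in to show the core idea of learning a star/galaxy split from measurements; the feature ("concentration index") and data are made up, and the real system learned full decision trees over many image attributes:

```python
# Toy one-feature rule induction: learn the threshold that best separates
# two classes in a labelled training set. Feature values and labels below
# are hypothetical, not DPOSS measurements.

def learn_stump(values, labels):
    """Pick the threshold on a single feature that classifies the most
    training examples correctly, allowing either polarity."""
    best_t, best_correct = None, -1
    for t in sorted(set(values)):
        hits = sum((v >= t) == label for v, label in zip(values, labels))
        hits = max(hits, len(labels) - hits)  # try the reversed rule too
        if hits > best_correct:
            best_t, best_correct = t, hits
    return best_t, best_correct

# Hypothetical concentration indices: galaxies (True) are more diffuse.
conc = [0.2, 0.3, 0.25, 0.8, 0.9, 0.7]
is_galaxy = [False, False, False, True, True, True]
threshold, n_correct = learn_stump(conc, is_galaxy)
print(threshold, n_correct)  # -> 0.7 6
```

A real tree learner recursively applies splits like this, which is how a near-optimal rule set can be derived from training examples rather than hand-tuned.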

  10. Patient-Physician Communication and Knowledge Regarding Fertility Issues from German Oncologists' Perspective-a Quantitative Survey.

    PubMed

    Buske, Dorit; Sender, Annekathrin; Richter, Diana; Brähler, Elmar; Geue, Kristina

    2016-03-01

    Many people diagnosed with haematologic malignancies are of child-bearing age. Typical treatment courses pose a high risk of infertility, and a lot of people affected by this are in the midst of starting or growing their families. Thus, it is crucial that they are well informed about fertility preservation options and can discuss these with an oncologist early on in the development of their treatment plans. Unfortunately, however, this does not always happen. One hundred twenty oncologists from 37 German adult clinical facilities were surveyed regarding their discussions with young patients about fertility, family planning, and fertility preservation. Almost all of them said that they consider fertility preservation to be an important issue. They also reported several factors as having an influence on the likelihood and practicability of discussing these subjects. Most knew about the existence of cryoconservation of germ cells and the use of GnRH analogues (95 %), but only half of them claimed to have a thorough understanding of these options. Many said they would like to learn more about this and that informational brochures could be helpful. Even though many oncologists do have good working knowledge of the subject, patients of reproductive age are not yet consistently given comprehensive information about the options available to them. To improve oncologists' knowledge of reproductive medicine, cooperation with fertility specialists should be facilitated, and informational leaflets should be made available both to patients and their medical care providers. PMID:25934223

  11. Large-Scale Survey Findings Inform Patients’ Experiences in Using Secure Messaging to Engage in Patient-Provider Communication and Self-Care Management: A Quantitative Assessment

    PubMed Central

    Patel, Nitin R; Lind, Jason D; Antinori, Nicole

    2015-01-01

    Background Secure email messaging is part of a national transformation initiative in the United States to promote new models of care that support enhanced patient-provider communication. To date, only a limited number of large-scale studies have evaluated users’ experiences in using secure email messaging. Objective To quantitatively assess veteran patients’ experiences in using secure email messaging in a large patient sample. Methods A cross-sectional mail-delivered paper-and-pencil survey study was conducted with a sample of respondents who had registered for the Veterans Health Administration’s Web-based patient portal (My HealtheVet) and opted to use secure messaging. The survey collected demographic data and assessed computer literacy, health literacy, and secure messaging use. Analyses conducted on survey data include frequencies and proportions, chi-square tests, and one-way analysis of variance. Results The majority of respondents (N=819) reported using secure messaging 6 months or longer (n=499, 60.9%). They reported secure messaging to be helpful for completing medication refills (n=546, 66.7%), managing appointments (n=343, 41.9%), looking up test results (n=350, 42.7%), and asking health-related questions (n=340, 41.5%). Notably, some respondents reported using secure messaging to address sensitive health topics (n=67, 8.2%). Survey responses indicated that younger age (P=.039) and higher levels of education (P=.025) and income (P=.003) were associated with more frequent use of secure messaging. Females were more likely to report using secure messaging more often, compared with their male counterparts (P=.098). Minorities were more likely to report using secure messaging more often, at least once a month, compared with nonminorities (P=.086). 
Individuals with higher levels of health literacy reported more frequent use of secure messaging (P=.007), greater satisfaction (P=.002), and indicated that secure messaging is a useful (P=.002) and easy-to-use (P≤.001) communication tool, compared with individuals with lower reported health literacy. Many respondents (n=328, 40.0%) reported that they would like to receive education and/or felt other veterans would benefit from education on how to access and use the electronic patient portal and secure messaging (n=652, 79.6%). Conclusions Survey findings validated qualitative findings found in previous research, such that veterans perceive secure email messaging as a useful tool for communicating with health care teams. To maximize sustained utilization of secure email messaging, marketing, education, skill building, and system modifications are needed. These findings can inform ongoing efforts to promote the sustained use of this electronic tool to support for patient-provider communication. PMID:26690761
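
The associations above rest on chi-square tests of frequency data. As a generic illustration of that machinery (not the study's actual tables), here is a stdlib-only Pearson chi-square for a 2x2 contingency table; the counts are hypothetical:

```python
import math

# Pearson chi-square for a 2x2 table [[a, b], [c, d]], 1 degree of freedom.
# Counts below are hypothetical, not data from the survey above.

def chi2_2x2(a, b, c, d):
    """Return (statistic, p-value) for a 2x2 contingency table."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # For df=1, the chi-square survival function reduces to erfc(sqrt(x/2)).
    p = math.erfc(math.sqrt(stat / 2.0))
    return stat, p

# e.g. frequent secure-messaging use (rows) cross-tabulated against a
# two-level education variable (columns):
stat, p = chi2_2x2(120, 80, 90, 110)
print(f"chi2={stat:.2f} p={p:.4f}")
```

The df=1 shortcut works because a chi-square variate with one degree of freedom is the square of a standard normal, so its tail probability is erfc(sqrt(x/2)).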

  12. THE COS-HALOS SURVEY: RATIONALE, DESIGN, AND A CENSUS OF CIRCUMGALACTIC NEUTRAL HYDROGEN

    SciTech Connect

    Tumlinson, Jason; Thom, Christopher; Sembach, Kenneth R.; Werk, Jessica K.; Prochaska, J. Xavier; Davé, Romeel; Oppenheimer, Benjamin D.; Ford, Amanda Brady; O'Meara, John M.; Peeples, Molly S.; Weinberg, David H.

    2013-11-01

    We present the design and methods of the COS-Halos survey, a systematic investigation of the gaseous halos of 44 z = 0.15-0.35 galaxies using background QSOs observed with the Cosmic Origins Spectrograph aboard the Hubble Space Telescope. This survey has yielded 39 spectra of z_em ≅ 0.5 QSOs with S/N ~ 10-15 per resolution element. The QSO sightlines pass within 150 physical kpc of the galaxies, which span early and late types over stellar mass log M*/M_☉ = 9.5-11.5. We find that the circumgalactic medium exhibits strong H I, averaging ≅ 1 Å in Lyα equivalent width out to 150 kpc, with 100% covering fraction for star-forming galaxies and 75% covering for passive galaxies. We find good agreement in column densities between this survey and previous studies over a similar range of impact parameter. There is weak evidence for a difference between early- and late-type galaxies in the strength and distribution of H I. Kinematics indicate that the detected material is bound to the host galaxy, such that ≳90% of the detected column density is confined within ±200 km s⁻¹ of the galaxies. This material generally exists well below the halo virial temperatures, at T ≲ 10⁵ K. We evaluate a number of possible origin scenarios for the detected material, and in the end favor a simple model in which the bulk of the detected H I arises in a bound, cool, low-density photoionized diffuse medium that is generic to all L* galaxies and may harbor a total gaseous mass comparable to galactic stellar masses.

  13. Changes in depth occupied by Great Lakes lake whitefish populations and the influence of survey design

    USGS Publications Warehouse

Rennie, Michael D.; Weidel, Brian C.; Claramunt, Randy; Dunlop, Erin S.

    2015-01-01

Understanding fish habitat use is important in determining conditions that ultimately affect fish energetics, growth and reproduction. Great Lakes lake whitefish (Coregonus clupeaformis) have demonstrated dramatic changes in growth and life history traits since the appearance of dreissenid mussels in the Great Lakes, but the role of habitat occupancy in driving these changes is poorly understood. To better understand temporal changes in lake whitefish depth of capture (Dw), we compiled a database of fishery-independent surveys representing multiple populations across all five Laurentian Great Lakes. By demonstrating the importance of survey design in estimating Dw, we describe a novel method for detecting survey-based bias in Dw and removing potentially biased data. Using unbiased Dw estimates, we show clear differences in the pattern and timing of changes in lake whitefish Dw between our reference sites (Lake Superior) and those that have experienced significant benthic food web changes (lakes Michigan, Huron, Erie and Ontario). Lake whitefish Dw in Lake Superior tended to shift gradually to shallower waters, but changed rapidly in other locations coincident with dreissenid establishment and declines in Diporeia densities. Almost all lake whitefish populations that were exposed to dreissenids demonstrated deeper Dw following benthic food web change, though a subset of these populations subsequently shifted to shallower depths. In some cases in lakes Huron and Ontario, shifts towards shallower Dw are occurring well after documented Diporeia collapse, suggesting the role of other drivers such as habitat availability or reliance on alternative prey sources.

  14. Attitudes towards the sharing of genetic information with at-risk relatives: results of a quantitative survey.

    PubMed

    Heaton, Timothy J; Chico, Victoria

    2016-01-01

To investigate public attitudes towards receiving genetic information arising from a test on a relative, 955 University of Sheffield students and staff were surveyed using disease vignettes. Strength of attitude was measured on whether, in the event of relevant information being discovered, they, as an at-risk relative, would want to be informed; whether the at-risk relative's interest should override proband confidentiality; and, if they had been the proband, their willingness to give up confidentiality to inform such relatives. Results indicated considerably more complexity to the decision-making than simple statistical risk. Desire for information only slightly increased with risk of disease manifestation [log odds 0.05 (0.04, 0.06) per percentage-point increase in manifestation risk]. Condition preventability was the primary factor increasing desire [modifiable baseline, non-preventable log odds -1.74 (-2.04, -1.44); preventable 0.64 (0.34, 0.95)]. Disease seriousness also increased desire [serious baseline, non-serious log odds -0.89 (-1.19, -0.59); fatal 0.55 (0.25, 0.86)]. Individuals with lower education levels exhibited much greater desire to be informed [GCSE log odds 1.67 (0.64, 2.66)]. Age did not affect desire. Our findings suggest that attitudes were influenced more by disease characteristics than statistical risk. Respondents generally expressed strong attitudes, demonstrating that this was not an issue about which people felt ambivalent. We provide estimates of the proportions of the British population in favour of/against disclosure for various disease scenarios. PMID:26612611
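The effect sizes above are log odds; exponentiating them gives odds ratios, which are easier to read at a glance. A minimal sketch (point estimates transcribed from the abstract; the dictionary labels are my own, not the paper's):

```python
import math

# Log-odds coefficients reported in the abstract (point estimates only).
log_odds = {
    "per_point_manifestation_risk": 0.05,
    "non_preventable_vs_modifiable": -1.74,
    "preventable_vs_modifiable": 0.64,
    "non_serious_vs_serious": -0.89,
    "fatal_vs_serious": 0.55,
    "gcse_vs_baseline_education": 1.67,
}

# Exponentiating a log-odds coefficient yields an odds ratio.
odds_ratios = {name: math.exp(b) for name, b in log_odds.items()}

for name, odds_ratio in odds_ratios.items():
    print(f"{name}: OR = {odds_ratio:.2f}")
```

For example, a preventable condition roughly doubles the odds of wanting to be informed relative to the modifiable baseline (OR ≈ 1.9), while a non-preventable one cuts the odds to about a fifth.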

  15. Utility FGD survey, January--December 1989. Volume 2, Design performance data for operating FGD systems: Part 2

    SciTech Connect

    Hance, S.L.; McKibben, R.S.; Jones, F.M.

    1992-03-01

This is Volume 2, Part 2, of the Utility Flue Gas Desulfurization (FGD) Survey report, which is generated by a computerized database management system and represents a survey of operational and planned domestic utility FGD systems. It summarizes information contributed by the utility industry, system and equipment suppliers, system designers, research organizations, and regulatory agencies. The data cover system design, fuel characteristics, operating history, and actual system performance. Also included is a unit-by-unit discussion of problems and solutions associated with the boilers, scrubbers, and FGD systems. This volume in particular contains basic design and performance data.

  16. Challenges and strategies in the administration of a population based mortality follow-back survey design

    PubMed Central

    2013-01-01

Population-based mortality follow-back survey designs have been used to collect information concerning end-of-life care from bereaved family members in several countries. In Canada, this design was recently employed to gather population-based information about the end-of-life care experience among adults in Nova Scotia as perceived by the decedent's family. In this article we describe challenges that emerged during the implementation of the study design and discuss resolution strategies to help overcome them. Challenges encountered included the inability to directly contact potential participants, difficulties ascertaining eligibility, mailing strategy complications, and the overall effect of these issues on response rate and subsequent sample size. Although not all challenges were amenable to resolution, the strategies implemented proved beneficial to the overall process and resulted in surpassing the targeted sample size. The inability to directly contact potential participants is an increasing reality, and the limitations associated with this process are best acknowledged during study development. Future studies should also consider addressing participant concerns pertaining to their eligibility and the use of a more cost-effective mailing strategy. PMID:23919380

  17. Sampling design for the Birth in Brazil: National Survey into Labor and Birth.

    PubMed

    Vasconcellos, Mauricio Teixeira Leite de; Silva, Pedro Luis do Nascimento; Pereira, Ana Paula Esteves; Schilithz, Arthur Orlando Correa; Souza Junior, Paulo Roberto Borges de; Szwarcwald, Celia Landmann

    2014-08-01

    This paper describes the sample design for the National Survey into Labor and Birth in Brazil. The hospitals with 500 or more live births in 2007 were stratified into: the five Brazilian regions; state capital or not; and type of governance. They were then selected with probability proportional to the number of live births in 2007. An inverse sampling method was used to select as many days (minimum of 7) as necessary to reach 90 interviews in the hospital. Postnatal women were sampled with equal probability from the set of eligible women, who had entered the hospital in the sampled days. Initial sample weights were computed as the reciprocals of the sample inclusion probabilities and were calibrated to ensure that total estimates of the number of live births from the survey matched the known figures obtained from the Brazilian System of Information on Live Births. For the two telephone follow-up waves (6 and 12 months later), the postnatal woman's response probability was modelled using baseline covariate information in order to adjust the sample weights for nonresponse in each follow-up wave. PMID:25167189
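The weighting chain described above (base weights as reciprocals of inclusion probabilities, then calibration to a known total of live births) can be sketched as follows. The hospital frame is invented, and PPS sampling with replacement stands in as a simplification of the survey's actual without-replacement design:

```python
import random

random.seed(42)

# Hypothetical frame: hospitals with their live births in the reference year.
hospitals = {"A": 2000, "B": 1200, "C": 800, "D": 3000, "E": 500}
total_births = sum(hospitals.values())
n_sample = 2

# Select hospitals with probability proportional to size (live births).
names = list(hospitals)
sizes = [hospitals[h] for h in names]
sampled = random.choices(names, weights=sizes, k=n_sample)

# Base weight = reciprocal of the expected inclusion probability.
base_weights = {h: total_births / (n_sample * hospitals[h]) for h in sampled}

# Calibrate: rescale weights so the weighted total of live births matches
# the known frame total (mimicking calibration to the national system of
# information on live births).
weighted_total = sum(base_weights[h] * hospitals[h] for h in base_weights)
g = total_births / weighted_total
calibrated = {h: w * g for h, w in base_weights.items()}
```

After calibration the weighted estimate of total live births reproduces the frame total exactly, which is the point of the adjustment.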

  18. Sampling design for an integrated socioeconomic and ecological survey by using satellite remote sensing and ordination

    PubMed Central

    Binford, Michael W.; Lee, Tae Jeong; Townsend, Robert M.

    2004-01-01

Environmental variability is an important risk factor in rural agricultural communities. Testing models requires empirical sampling that generates data that are representative in both economic and ecological domains. Detrended correspondence analysis of satellite remote sensing data was used to design an effective low-cost sampling protocol for a field study to create an integrated socioeconomic and ecological database when no prior information on the ecology of the survey area existed. We stratified the sample for the selection of tambons from various preselected provinces in Thailand based on factor analysis of spectral land-cover classes derived from satellite data. We conducted the survey for the sampled villages in the chosen tambons. The resulting data capture interesting variations in soil productivity and in the timing of good and bad years, which a purely random sample would likely have missed. Thus, this database will allow tests of hypotheses concerning the effect of credit on productivity, the sharing of idiosyncratic risks, and the economic influence of environmental variability. PMID:15254298
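The stratification step — allocating a fixed number of sampling units per ecologically defined stratum, so environmental variability is represented by design rather than by chance — can be sketched as below. The tambon names and the three land-cover strata are invented for illustration:

```python
import random

random.seed(0)

# Hypothetical frame: tambons tagged with a land-cover stratum that would,
# in practice, come from ordination of satellite spectral classes.
strata = ["forest", "paddy", "upland"]
tambons = [(f"tambon_{i}", strata[i % 3]) for i in range(60)]

# Group the frame by stratum.
by_stratum = {}
for name, stratum in tambons:
    by_stratum.setdefault(stratum, []).append(name)

# Draw an equal-size random sample within each stratum.
per_stratum = 4
sample = {s: random.sample(members, per_stratum)
          for s, members in by_stratum.items()}
```

A simple random sample of the same total size could miss a rare stratum entirely; fixing the per-stratum allocation rules that out.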

  19. SURVEY DESIGN FOR SPECTRAL ENERGY DISTRIBUTION FITTING: A FISHER MATRIX APPROACH

    SciTech Connect

    Acquaviva, Viviana; Gawiser, Eric; Bickerton, Steven J.; Grogin, Norman A.; Guo Yicheng; Lee, Seong-Kook

    2012-04-10

The spectral energy distribution (SED) of a galaxy contains information on the galaxy's physical properties, and multi-wavelength observations are needed in order to measure these properties via SED fitting. In planning these surveys, optimization of the resources is essential. The Fisher Matrix (FM) formalism can be used to quickly determine the best possible experimental setup to achieve the desired constraints on the SED-fitting parameters. However, because it relies on the assumption of a Gaussian likelihood function, it is in general less accurate than other slower techniques that reconstruct the probability distribution function (PDF) from the direct comparison between models and data. We compare the uncertainties on SED-fitting parameters predicted by the FM to the ones obtained using the more thorough PDF-fitting techniques. We use both simulated spectra and real data, and consider a large variety of target galaxies differing in redshift, mass, age, star formation history, dust content, and wavelength coverage. We find that the uncertainties reported by the two methods agree within a factor of two in the vast majority (∼90%) of cases. If the age determination is uncertain, the top-hat prior in age used in PDF fitting to prevent each galaxy from being older than the universe needs to be incorporated in the FM, at least approximately, before the two methods can be properly compared. We conclude that the FM is a useful tool for astronomical survey design.
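The forecasting machinery the abstract leans on — a Fisher matrix built from the derivatives of the model fluxes under a Gaussian likelihood — can be sketched with a toy two-parameter model. The "SED" below is purely illustrative, not a real stellar population model:

```python
import numpy as np

# Toy SED: flux in four bands as a function of an amplitude m and a
# shape parameter t. Illustrative only.
bands = np.array([0.5, 1.0, 2.0, 4.0])  # effective wavelengths (microns)

def model_flux(m, t):
    return m * np.exp(-bands / t)

def fisher_matrix(theta, sigma, eps=1e-5):
    """F_ij = sum_k (dF_k/dtheta_i)(dF_k/dtheta_j) / sigma_k^2 --
    the Fisher matrix for independent Gaussian flux errors, built here
    with central-difference numerical derivatives."""
    theta = np.asarray(theta, dtype=float)
    derivs = []
    for i in range(theta.size):
        up, dn = theta.copy(), theta.copy()
        up[i] += eps
        dn[i] -= eps
        derivs.append((model_flux(*up) - model_flux(*dn)) / (2 * eps))
    D = np.array(derivs)                      # (n_params, n_bands)
    return D @ np.diag(1.0 / sigma**2) @ D.T  # (n_params, n_params)

theta0 = np.array([1.0, 2.0])
sigma = np.full(bands.size, 0.05)             # per-band flux uncertainty
F = fisher_matrix(theta0, sigma)
forecast_errors = np.sqrt(np.diag(np.linalg.inv(F)))  # 1-sigma forecasts
```

Inverting F gives the best-case parameter covariance; comparing such forecast errors with full PDF-fitting uncertainties is exactly the factor-of-two check the paper performs.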

  20. Developing an efficient modelling and data presentation strategy for ATDEM system comparison and survey design

    NASA Astrophysics Data System (ADS)

    Combrinck, Magdel

    2015-10-01

    Forward modelling of airborne time-domain electromagnetic (ATDEM) responses is frequently used to compare systems and design surveys for optimum detection of expected mineral exploration targets. It is a challenging exercise to display and analyse the forward modelled responses due to the large amount of data generated for three dimensional models as well as the system dependent nature of the data. I propose simplifying the display of ATDEM responses through using the dimensionless quantity of signal-to-noise ratios (signal:noise) instead of respective system units. I also introduce the concept of a three-dimensional signal:noise nomo-volume as an efficient tool to visually present and analyse large amounts of data. The signal:noise nomo-volume is a logical extension of the two-dimensional conductance nomogram. It contains the signal:noise values of all system time channels and components for various target depths and conductances integrated into a single interactive three-dimensional image. Responses are calculated over a complete survey grid and therefore include effects of system and target geometries. The user can interactively select signal:noise cut-off values on the nomo-volume and is able to perform visual comparisons between various system and target responses. The process is easy to apply and geophysicists with access to forward modelling airborne electromagnetic (AEM) and three-dimensional imaging software already possess the tools required to produce and analyse signal:noise nomo-volumes.
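The nomo-volume idea — a 3D array of signal:noise over target depth, conductance, and time channel, thresholded interactively — can be sketched with a stand-in response function. The decay model and noise floor below are hypothetical placeholders for real forward-modelled AEM responses:

```python
import numpy as np

# Axes of the nomo-volume.
depths = np.linspace(50, 400, 8)        # target depth (m)
conductances = np.linspace(1, 100, 10)  # target conductance (S)
channels = np.arange(1, 21)             # time-channel index

D, C, T = np.meshgrid(depths, conductances, channels, indexing="ij")

# Hypothetical response: grows with conductance, decays with depth and
# (conductance-dependently) across time channels. NOT a real ATDEM model.
response = 1e4 * C * np.exp(-D / 150.0) * np.exp(-T / (0.2 * C))

noise_floor = 1.0             # per-channel noise in the same units
snr = response / noise_floor  # the dimensionless signal:noise volume

# Interactive-style cutoff: which (depth, conductance) cells reach
# signal:noise >= 3 on at least one time channel?
detectable = (snr >= 3.0).any(axis=2)
```

Rendering `snr` as an isosurface at the chosen cutoff is the 3D-imaging step the abstract describes; the cutoff, not the system's native units, becomes the common axis for comparing systems.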

  1. A survey of pulse shape options for a revised plastic ablator ignition design

    SciTech Connect

    Clark, D. S.; Milovich, J. L.; Hinkel, D. E.; Salmonson, J. D.; Peterson, J. L.; Berzak Hopkins, L. F.; Eder, D. C.; Haan, S. W.; Jones, O. S.; Marinak, M. M.; Robey, H. F.; Smalyuk, V. A.; Weber, C. R.

    2014-11-15

    Recent experimental results using the “high foot” pulse shape for inertial confinement fusion ignition experiments on the National Ignition Facility (NIF) [Moses et al., Phys. Plasmas 16, 041006 (2009)] have shown encouraging progress compared to earlier “low foot” experiments. These results strongly suggest that controlling ablation front instability growth can significantly improve implosion performance even in the presence of persistent, large, low-mode distortions. Simultaneously, hydrodynamic growth radiography experiments have confirmed that ablation front instability growth is being modeled fairly well in NIF experiments. It is timely then to combine these two results and ask how current ignition pulse shapes could be modified to improve one-dimensional implosion performance while maintaining the stability properties demonstrated with the high foot. This paper presents such a survey of pulse shapes intermediate between the low and high foot extremes in search of an intermediate foot optimum. Of the design space surveyed, it is found that a higher picket version of the low foot pulse shape shows the most promise for improved compression without loss of stability.

2. Designing HIGH-COST Medicine: Hospital Surveys, Health Planning, and the Paradox of Progressive Reform

    PubMed Central

    2010-01-01

    Inspired by social medicine, some progressive US health reforms have paradoxically reinforced a business model of high-cost medical delivery that does not match social needs. In analyzing the financial status of their areas’ hospitals, for example, city-wide hospital surveys of the 1910s through 1930s sought to direct capital investments and, in so doing, control competition and markets. The 2 national health planning programs that ran from the mid-1960s to the mid-1980s continued similar strategies of economic organization and management, as did the so-called market reforms that followed. Consequently, these reforms promoted large, extremely specialized, capital-intensive institutions and systems at the expense of less complex (and less costly) primary and chronic care. The current capital crisis may expose the lack of sustainability of such a model and open up new ideas and new ways to build health care designed to meet people's health needs. PMID:20019312

  3. Designing HIGH-COST medicine: hospital surveys, health planning, and the paradox of progressive reform.

    PubMed

    Perkins, Barbara Bridgman

    2010-02-01

    Inspired by social medicine, some progressive US health reforms have paradoxically reinforced a business model of high-cost medical delivery that does not match social needs. In analyzing the financial status of their areas' hospitals, for example, city-wide hospital surveys of the 1910s through 1930s sought to direct capital investments and, in so doing, control competition and markets. The 2 national health planning programs that ran from the mid-1960s to the mid-1980s continued similar strategies of economic organization and management, as did the so-called market reforms that followed. Consequently, these reforms promoted large, extremely specialized, capital-intensive institutions and systems at the expense of less complex (and less costly) primary and chronic care. The current capital crisis may expose the lack of sustainability of such a model and open up new ideas and new ways to build health care designed to meet people's health needs. PMID:20019312

  4. Hot rocket plume experiment - Survey and conceptual design. [of rhenium-iridium bipropellants

    NASA Technical Reports Server (NTRS)

    Millard, Jerry M.; Luan, Taylor W.; Dowdy, Mack W.

    1992-01-01

Attention is given to a study of a space-borne engine plume experiment that will both verify and quantify the reduced contamination from advanced rhenium-iridium earth-storable bipropellant rockets ("hot rockets") and provide a correlation between high-fidelity, in-space measurements and theoretical plume and surface contamination models. The experiment conceptual design is based on survey results from plume and contamination technologists throughout the U.S. With respect to shuttle use, cursory investigations validate Hitchhiker availability and adaptability, adequate remote manipulator system (RMS) articulation and dynamic capability, acceptable RMS attachment capability, adequate power and telemetry capability, and adequate flight altitude and attitude/orbital capability.

  5. Autonomous Underwater Vehicle Survey Design for Monitoring Carbon Capture and Storage Sites

    NASA Astrophysics Data System (ADS)

    Bull, J. M.; Cevatoglu, M.; Connelly, D.; Wright, I. C.; McPhail, S.; Shitashima, K.

    2013-12-01

Long-term monitoring of sub-seabed Carbon Capture and Storage (CCS) sites will require systems that are flexible, independent, and have long endurance. In this presentation we will discuss the utility of autonomous underwater vehicles equipped with different sensor packages in monitoring storage sites. We will present data collected using the Autosub AUV, as part of the ECO2 project, from the Sleipner area of the North Sea. The Autosub AUV was equipped with sidescan sonar, an EM2000 multibeam system, a Chirp sub-bottom profiler, and a variety of chemical sensors. Our presentation will focus on survey design, and the simultaneous use of multiple sensor packages in environmental monitoring on the continental shelf.

  6. Technology transfer with system analysis, design, decision making, and impact (Survey-2000) in acute care hospitals in the United States.

    PubMed

    Hatcher, M

    2001-10-01

This paper provides the results of the Survey-2000 measuring technology transfer for management information systems in health care. The relationships with systems approaches, user involvement, user satisfaction, and decision-making were measured and are presented. The survey also measured the levels of Internet and Intranet presence in acute care hospitals, which will be discussed in future articles. The depth of the survey includes e-commerce, both business-to-business and business-to-customer. These results are compared, where appropriate, with results from Survey-1997, and changes are discussed. This information will provide benchmarks for hospitals to plan their network technology position and to set goals. This is the first of three articles based upon the results of the Survey-2000. Readers are referred to a prior article by the author that discusses the survey design and provides a tutorial on technology transfer in acute care hospitals. PMID:11508906

  7. S-CANDELS: The Spitzer-Cosmic Assembly Near-Infrared Deep Extragalactic Survey. Survey Design, Photometry, and Deep IRAC Source Counts

    NASA Astrophysics Data System (ADS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Dunlop, J. S.; Egami, E.; Faber, S. M.; Ferguson, H. C.; Grogin, N. A.; Hora, J. L.; Huang, J.-S.; Koekemoer, A. M.; Labbé, I.; Wang, Z.

    2015-06-01

The Spitzer-Cosmic Assembly Deep Near-infrared Extragalactic Legacy Survey (S-CANDELS; PI G. Fazio) is a Cycle 8 Exploration Program designed to detect galaxies at very high redshifts (z > 5). To mitigate the effects of cosmic variance and also to take advantage of deep coextensive coverage in multiple bands by the Hubble Space Telescope (HST) Multi-cycle Treasury Program CANDELS, S-CANDELS was carried out within five widely separated extragalactic fields: the UKIDSS Ultra-deep Survey, the Extended Chandra Deep Field South, COSMOS, the HST Deep Field North, and the Extended Groth Strip. S-CANDELS builds upon the existing coverage of these fields from the Spitzer Extended Deep Survey (SEDS), a Cycle 6 Exploration Program, by increasing the integration time from SEDS' 12 hr to a total of 50 hr but within a smaller area, 0.16 deg². The additional depth significantly increases the survey completeness at faint magnitudes. This paper describes the S-CANDELS survey design, processing, and publicly available data products. We present Infrared Array Camera (IRAC) dual-band 3.6 and 4.5 μm catalogs reaching to a depth of 26.5 AB mag. Deep IRAC counts for the roughly 135,000 galaxies detected by S-CANDELS are consistent with models based on known galaxy populations. The increase in depth beyond earlier Spitzer/IRAC surveys does not reveal a significant additional contribution from discrete sources to the diffuse Cosmic Infrared Background (CIB). Thus it remains true that only roughly half of the estimated CIB flux from COBE/DIRBE is resolved.
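The depth gain from raising the integration time from SEDS' 12 hr to 50 hr can be checked against the standard background-limited scaling, in which point-source sensitivity improves with the square root of exposure time (a textbook relation, not a figure from the paper):

```python
import math

def depth_gain_mag(t_new_hr, t_old_hr):
    """Background-limited depth gain in magnitudes:
    delta_m = 2.5 * log10(sqrt(t_new / t_old)) = 1.25 * log10(t_new / t_old)."""
    return 1.25 * math.log10(t_new_hr / t_old_hr)

gain = depth_gain_mag(50, 12)  # S-CANDELS vs. SEDS integration times
```

This predicts roughly 0.8 mag of additional depth, which is why the added integration "significantly increases the survey completeness at faint magnitudes."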

  8. Design and development of the 3.2 gigapixel camera for the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Kahn, S. M.; Kurita, N.; Gilmore, K.; Nordby, M.; O'Connor, P.; Schindler, R.; Oliver, J.; Van Berg, R.; Olivier, S.; Riot, V.; Antilogus, P.; Schalk, T.; Huffer, M.; Bowden, G.; Singal, J.; Foss, M.

    2010-07-01

The Large Synoptic Survey Telescope (LSST) is a large aperture, wide-field facility designed to provide deep images of half the sky every few nights. There is only a single instrument on the telescope, a 9.6 square degree visible-band camera, which is mounted close to the secondary mirror and points down toward the tertiary. The requirements of the LSST camera present substantial technical design challenges. To cover the entire 0.35 to 1 μm visible band, the camera incorporates an array of 189 over-depleted bulk silicon CCDs with 10 μm pixels. The CCDs are assembled into 3 x 3 "rafts", which are then mounted to a silicon carbide grid to achieve a total focal plane flatness of 15 μm p-v. The CCDs have 16 amplifiers per chip, enabling the entire 3.2 Gigapixel image to be read out in 2 seconds. Unlike previous astronomical cameras, the vast majority of the focal plane electronics are housed in the cryostat, which uses a mixed-refrigerant Joule-Thomson system to maintain a -100°C sensor temperature. The shutter mechanism uses a 3-blade stack design and a Hall-effect sensor to achieve high resolution and uniformity. There are 5 filters stored in a carousel around the cryostat, and the auto changer requires a dual guide system to control its position due to severe space constraints. This paper presents an overview of the current state of the camera design and development plan.

  9. Nonexperimental Quantitative Research and Its Role in Guiding Instruction

    ERIC Educational Resources Information Center

    Cook, Bryan G.; Cook, Lysandra

    2008-01-01

    Different research designs answer different questions. Educators cannot use nonexperimental quantitative research designs, such as descriptive surveys and correlational research, to determine definitively that an intervention causes improved student outcomes and is an evidence-based practice. However, such research can (a) inform educators about a…

  10. Mechanical Design of NESSI: New Mexico Tech Extrasolar Spectroscopic Survey Instrument

    NASA Technical Reports Server (NTRS)

    Santoro, Fernando G.; Olivares, Andres M.; Salcido, Christopher D.; Jimenez, Stephen R.; Jurgenson, Colby A.; Hrynevych, Michael A.; Creech-Eakman, Michelle J.; Boston, Penny J.; Schmidt, Luke M.; Bloemhard, Heather; Rodeheffer, Dan; Vaive, Genevieve; Vasisht, Gautam; Swain, Mark R.; Deroo, Pieter

    2011-01-01

NESSI: the New Mexico Tech Extrasolar Spectroscopic Survey Instrument is a ground-based multi-object spectrograph that operates in the near-infrared. It will be installed on one of the Nasmyth ports of the Magdalena Ridge Observatory (MRO) 2.4-meter Telescope sited in the Magdalena Mountains, about 48 km west of Socorro, NM. NESSI is mounted stationary with respect to the telescope fork so as not to produce differential flexure between internal opto-mechanical components during or between observations. An appropriate mechanical design allows the instrument alignment to be highly repeatable and stable for both short and long observation timescales, within a wide range of temperature variation. NESSI is optically composed of a field lens, a field de-rotator, re-imaging optics, an auto-guider and a Dewar spectrograph that operates at LN2 temperature. In this paper we report on NESSI's detailed mechanical and opto-mechanical design, and the planning for mechanical construction, assembly, integration and verification.

  11. Quantitative Evaluation of Tissue Surface Adaption of CAD-Designed and 3D Printed Wax Pattern of Maxillary Complete Denture

    PubMed Central

    Chen, Hu; Wang, Han; Lv, Peijun; Wang, Yong; Sun, Yuchun

    2015-01-01

Objective. To quantitatively evaluate the tissue surface adaption of a maxillary complete denture wax pattern produced by CAD and 3DP. Methods. A standard edentulous maxilla plaster cast model was used, for which a wax pattern of a complete denture was designed using CAD software developed in our previous study and printed using a 3D wax printer, while another wax pattern was manufactured by the traditional manual method. The cast model and the two wax patterns were scanned in the 3D scanner as "DataModel," "DataWaxRP," and "DataWaxManual." After setting each wax pattern on the plaster cast, the whole model was scanned for registration. After registration, the deviations of the tissue surface between "DataModel" and "DataWaxRP" and between "DataModel" and "DataWaxManual" were measured. The data was analyzed by paired t-test. Results. For both wax patterns, produced by the CAD&RP method and the manual method, scanning data of the tissue surface and cast surface showed a good fit over the majority of the surface. No statistically significant difference (P > 0.05) was observed between the CAD&RP method and the manual method. Conclusions. A wax pattern of a maxillary complete denture produced by the CAD&3DP method is comparable with the traditional manual method in its adaption to the edentulous cast model. PMID:26583108

  12. The Hawk-I UDS and GOODS Survey (HUGS): Survey design and deep K-band number counts

    NASA Astrophysics Data System (ADS)

    Fontana, A.; Dunlop, J. S.; Paris, D.; Targett, T. A.; Boutsia, K.; Castellano, M.; Galametz, A.; Grazian, A.; McLure, R.; Merlin, E.; Pentericci, L.; Wuyts, S.; Almaini, O.; Caputi, K.; Chary, R.-R.; Cirasuolo, M.; Conselice, C. J.; Cooray, A.; Daddi, E.; Dickinson, M.; Faber, S. M.; Fazio, G.; Ferguson, H. C.; Giallongo, E.; Giavalisco, M.; Grogin, N. A.; Hathi, N.; Koekemoer, A. M.; Koo, D. C.; Lucas, R. A.; Nonino, M.; Rix, H. W.; Renzini, A.; Rosario, D.; Santini, P.; Scarlata, C.; Sommariva, V.; Stark, D. P.; van der Wel, A.; Vanzella, E.; Wild, V.; Yan, H.; Zibetti, S.

    2014-10-01

We present the results of a new, ultra-deep, near-infrared imaging survey executed with the Hawk-I imager at the ESO VLT, of which we make all the data (images and catalog) public. This survey, named HUGS (Hawk-I UDS and GOODS Survey), provides deep, high-quality imaging in the K and Y bands over the portions of the UKIDSS UDS and GOODS-South fields covered by the CANDELS HST WFC3/IR survey. In this paper we describe the survey strategy, the observational campaign, the data reduction process, and the data quality. We show that, thanks to exquisite image quality and extremely long exposure times, HUGS delivers the deepest K-band images ever collected over areas of cosmological interest, and in general ideally complements the CANDELS data set in terms of image quality and depth. In the GOODS-S field, the K-band observations cover the whole CANDELS area with a complex geometry made of 6 different, partly overlapping pointings, in order to best match the deep and wide areas of CANDELS imaging. In the deepest region (which includes most of the Hubble Ultra Deep Field) exposure times exceed 80 hours of integration, yielding a 1-σ magnitude limit per square arcsec of ≃28.0 AB mag. The seeing is exceptional and homogeneous across the various pointings, confined to the range 0.38-0.43 arcsec. In the UDS field the survey is about one magnitude shallower (to match the correspondingly shallower depth of the CANDELS images) but also includes Y-band imaging (which, in the UDS, was not provided by the CANDELS WFC3/IR imaging). In the K-band, with an average exposure time of 13 hours, and seeing in the range 0.37-0.43 arcsec, the 1-σ limit per square arcsec in the UDS imaging is ≃27.3 AB mag. In the Y-band, with an average exposure time of ≃8 h, and seeing in the range 0.45-0.5 arcsec, the imaging yields a 1-σ limit per square arcsec of ≃28.3 AB mag. 
We show that the HUGS observations are well matched to the depth of the CANDELS WFC3/IR data, since the majority of even the faintest galaxies detected in the CANDELS H-band images are also detected in HUGS. Finally we present the K-band galaxy number counts produced by combining the HUGS data from the two fields. We show that the slope of the number counts depends sensitively on the assumed distribution of galaxy sizes, with potential impact on the estimated extra-galactic background light. All the HUGS images and catalogues are made publicly available at the ASTRODEEP website (http://www.astrodeep.eu) as well as from the ESO archive. Full Table 3 is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/570/A11

  13. A systematic quantitative approach to rational drug design and discovery of novel human carbonic anhydrase IX inhibitors.

    PubMed

    Sethi, Kalyan K; Verma, Saurabh M

    2014-08-01

Drug design involves the design of small molecules that are complementary in shape and charge to the biomolecular target with which they interact and will therefore bind to it. Three-dimensional quantitative structure-activity relationship (3D-QSAR) studies were performed for a series of carbonic anhydrase IX inhibitors using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) techniques with the help of SYBYL 7.1 software. A large set of 36 different aromatic/heterocyclic sulfamate inhibitors of carbonic anhydrase (CA, EC 4.2.1.1), specifically hCA IX, was chosen for this study. The conventional ligand-based 3D-QSAR studies were performed on low-energy conformations employing a database alignment rule. The ligand-based model gave q² values of 0.802 and 0.829 and r² values of 1.000 and 0.994 for CoMFA and CoMSIA, respectively, and the predictive ability of the model was validated. The predicted r² values are 0.999 and 0.502 for CoMFA and CoMSIA, respectively. The SEA (steric, electrostatic, hydrogen bond acceptor) fields of CoMSIA made the most significant contribution to model development. Docking of the inhibitors into the hCA IX active site using Glide XP (Schrödinger) software revealed the vital interactions and binding conformations of the inhibitors. The CoMFA and CoMSIA field contour maps agree well with the structural characteristics of the binding pocket of the hCA IX active site, which suggests that the information rendered by the 3D-QSAR models and the docking interactions can provide guidelines for the development of improved hCA IX inhibitors as leads for various types of metastatic cancers, including those of cervical, renal, breast, and head and neck origin. PMID:24090419

  14. Survey of muscle relaxant effects management with a kinemyographic-based data archiving system: a retrospective quantitative and contextual quality control approach.

    PubMed

    Motamed, Cyrus; Bourgain, Jean Louis; D'Hollander, Alain

    2013-12-01

    In a retrospective quality control study of muscle relaxant management, we assessed unbiased files provided by an automatic archiving system using quantitative monitoring generated by a kinemyographic transducer, and we suggest improvements for a possible future design. 200 randomly selected files were double-checked to collect the values of the twitch height ratio (THr), train-of-four ratio (TOFr) and TOF count in four periods: reference values acquisition (REF), maximal level of paralysis, paralysis maintenance, and pre-tracheal-extubation residual paralysis assessment (RPA). The parameter values were selected according to period-specific predefined rules. A quantitative quality control was based upon standardized cut-off values; a contextual quality control was based upon the detection of "difficult-to-interpret" episodes. Results were expressed on a descriptive basis only. For the REF period, THrs and TOFrs were lacking in, respectively, 47 and 18 of the 200 recordings analysed. A starting TOFr above 0.90 existed in 119 files. Concomitant THrs and TOFrs >0.90 were evidenced 93 times. During the RPA period, TOFr >0.90 was recorded on 82 occasions. The optimal combination of THr >0.80 and TOFr >0.90 was detected in only 30 files. "Difficult-to-interpret" episodes were present in 18 files for the REF period, increasing to 42, 86 and 52 in the subsequent periods, most of them probably related to the absence of an initial calibration procedure. Under real-life conditions, near-optimal quality control is not always achievable with the quantitative neuromuscular monitoring studied. To improve NMT monitoring, calibration of the sensor should be performed rigorously by the anaesthesia provider, and the quality of this calibration must be displayed on the screen of the monitor. PMID:23838899
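The quantitative cut-offs described in this abstract (TOFr > 0.90 at reference and before extubation, THr > 0.80 combined with TOFr > 0.90 as the optimal pre-extubation state) can be sketched as a simple per-file classifier. The record fields and their names are hypothetical, not those of the actual archiving system:

```python
def qc_flags(record):
    """Classify one monitoring file against the cut-offs quoted in the
    abstract; values may be None when a parameter was never recorded."""
    flags = {}
    flags["ref_thr_missing"] = record.get("ref_thr") is None
    flags["ref_tofr_missing"] = record.get("ref_tofr") is None
    flags["ref_tofr_ok"] = (record.get("ref_tofr") or 0) > 0.90
    rpa_tofr, rpa_thr = record.get("rpa_tofr"), record.get("rpa_thr")
    flags["rpa_tofr_ok"] = rpa_tofr is not None and rpa_tofr > 0.90
    # "optimal" pre-extubation state: THr > 0.80 together with TOFr > 0.90
    flags["rpa_optimal"] = (flags["rpa_tofr_ok"]
                            and rpa_thr is not None and rpa_thr > 0.80)
    return flags

f = qc_flags({"ref_tofr": 0.95, "rpa_tofr": 0.92, "rpa_thr": 0.85})
print(f["rpa_optimal"])   # True
```

Running such rules over all 200 files would reproduce the descriptive counts reported above (e.g. files lacking reference values, files reaching the optimal pre-extubation state).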

  15. Utility FGD Survey, January--December 1989. Volume 2, Design performance data for operating FGD systems, Part 1

    SciTech Connect

    Hance, S.L.; McKibben, R.S.; Jones, F.M.

    1992-03-01

    The Utility Flue Gas Desulfurization (FGD) Survey report, which is generated by a computerized database management system, presents a survey of operational and planned domestic utility FGD systems. It summarizes information contributed by the utility industry, system and equipment suppliers, system designers, research organizations, and regulatory agencies. The data cover system design, fuel characteristics, operating history, and actual system performance. Also included is a unit-by-unit discussion of problems and solutions associated with the boilers, scrubbers, and FGD systems. The development status (operational, under construction, or in the planning stages), system supplier, process, waste disposal practice, and regulatory class are tabulated alphabetically by utility company.

  16. Software Design Aspects and First Test Results of VLT Survey Telescope Control System

    NASA Astrophysics Data System (ADS)

    Brescia, M.; Schipani, P.; Marty, L.; Capaccioli, M.

    2006-08-01

    The 2.6 m VLT Survey Telescope (VST) is going to be installed at Cerro Paranal (Chile) as a powerful survey instrument for the ESO VLT. The tightest requirements for such a telescope (a large field of view of 1°x1°, a pixel scale of 0.21 arcsec/pixel, and a site that is one of the best worldwide for astronomy) are essentially very high performance of the active optics and autoguiding systems and excellent axis control, in order to obtain the best overall image quality. The VST active optics software must provide the analysis of the image coming from the 10x10-subpupil Shack-Hartmann wavefront sensor and the calculation of primary mirror forces and secondary mirror displacements to correct the intrinsic aberrations of the optical system and those originating from thermal or gravitational effects. The algorithm to select the guide star depends on the specific geometry of the adapter system. The adapter of the VST hosts many devices handled by the overall telescope control software: a probe system to select the guide star, realized with motions in polar coordinates; a pickup mirror to fold the light to the image analysis and guiding cameras; a selectable reference light system; and a focusing device. All these devices interface deeply with the autoguiding, active optics and field rotation compensation systems. A reverse engineering approach, combined with the integration of new specific solutions, has been fundamental to meet the ESO commitments in terms of software re-use, in order to smooth the integration of a new telescope designed and built by an external institute into the ESO environment. The control software architecture, the simulation code used to validate the results, and the status of the work are described here. This paper also includes first results of preliminary tracking tests performed at the VST integration site for the azimuth, altitude and rotator axes, which already meet the system quality requirements.

  17. The Bochum survey of the southern Galactic disk: I. Survey design and first results on 50 square degrees monitored in 2011

    NASA Astrophysics Data System (ADS)

    Haas, M.; Hackstein, M.; Ramolla, M.; Drass, H.; Watermann, R.; Lemke, R.; Chini, R.

    2012-10-01

    We are monitoring a 6° wide stripe along the southern Galactic disk simultaneously in the r and i bands, using a robotic 15-cm twin telescope of the Universitätssternwarte Bochum near Cerro Armazones in Chile. Utilising the telescope's 2.7° field of view, the survey aims to observe a mosaic of 268 fields once per month and to monitor dedicated fields once per night. The survey reaches a sensitivity from 10m down to 18m (AB system), with a completeness limit of r˜15.5m and i˜14.5m which - due to the instrumental pixel size of 2.4 arcsec - refers to stars separated by >3 arcsec. This brightness range is ideally suited to examining the intermediately bright stellar population, which is assumed to be saturated in deep variability surveys with large telescopes. To connect to deep surveys or to explore faint long-term variables, coadded images of several nights reach a depth of ˜20m. The astrometric accuracy is better than 1 arcsec, as determined from the overlap of neighbouring fields. We describe the survey design, the data properties and our procedures for deriving the light curves and extracting variable stars. We present a list of ˜2200 variable stars identified in 50 square degrees with 50-80 observations between May and October 2011. For bright stars the variability amplitude A reaches down to A˜0.05m, while at the faint end variations of A>1m are detected. About 200 stars were previously known to be variable, and their amplitudes and periods - as far as determinable from our six-month monitoring - agree with literature values, demonstrating the performance of the Bochum Galactic Disk Survey.
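The amplitude-based extraction of variable stars mentioned in this abstract can be sketched with a minimal criterion. The threshold of 0.05 mag mirrors the smallest amplitudes the survey reports for bright stars; a real pipeline would also weigh the photometric noise per star, so this is only an illustrative rule:

```python
import numpy as np

def is_variable(mags, a_min=0.05):
    """Flag a light curve whose peak-to-peak amplitude exceeds a_min
    magnitudes (a_min = 0.05 mag follows the bright-star limit quoted
    above; the actual survey criterion may differ)."""
    mags = np.asarray(mags, dtype=float)
    return float(mags.max() - mags.min()) > a_min

rng = np.random.default_rng(1)
quiet = rng.normal(12.0, 0.002, 60)                    # constant star
pulsed = 12.0 + 0.5 * np.sin(np.linspace(0, 12, 60))   # amplitude ~ 1 mag
print(is_variable(quiet), is_variable(pulsed))         # False True
```

The 60 samples per star loosely match the 50-80 observations per field quoted above.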

  18. Optimal design of a lagrangian observing system for hydrodynamic surveys in coastal areas

    NASA Astrophysics Data System (ADS)

    Cucco, Andrea; Quattrocchi, Giovanni; Antognarelli, Fabio; Satta, Andrea; Maicu, Francesco; Ferrarin, Christian; Umgiesser, Georg

    2014-05-01

    The optimization of ocean observing systems is a pressing need for scientific research. In particular, the improvement of short-term ocean observing networks is achievable by reducing the cost-benefit ratio of the field campaigns and by increasing the quality of measurements. Numerical modeling is a powerful tool for determining the appropriateness of a specific observing system and for optimizing the sampling design. This is particularly true when observations are carried out in coastal areas and lagoons, where the use of satellites is prohibitive due to the shallowness of the water. For such areas, numerical models are the most efficient tool both to provide a preliminary assessment of the local physical environment and to make short-term predictions of its changes. In this context, a test case experiment was carried out within an enclosed shallow-water area, the Cabras Lagoon (Sardinia, Italy). The aim of the experiment was to explore the optimal design for a field survey based on the use of coastal lagrangian buoys. A three-dimensional hydrodynamic model based on the finite element method (SHYFEM3D, Umgiesser et al., 2004) was implemented to simulate the lagoon water circulation. The model domain extends over the whole Cabras Lagoon and the whole Gulf of Oristano, including the surrounding coastal area. Lateral open boundary conditions were provided by the operational ocean model system WMED, and only wind forcing, provided by the SKIRON atmospheric model (Kallos et al., 1997), was considered as the surface boundary condition. The model was applied to provide a number of ad hoc scenarios and to explore the efficiency of the short-term hydrodynamic survey. A first field campaign was carried out to investigate the lagrangian circulation inside the lagoon under the main wind forcing condition (Mistral wind from the north-west).
The trajectories followed by the lagrangian buoys and the estimated lagrangian velocities were used to calibrate the model parameters and to validate the simulation results. A set of calibration runs was performed and the model's accuracy in reproducing the surface circulation was assessed. A numerical simulation was then conducted to predict the wind-induced lagoon water circulation and the paths followed by numerical particles inside the lagoon domain. The simulated particle paths were analyzed and the optimal configuration for the buoy deployment was designed in real time. The selected deployment geometry was then tested during a further field campaign. The resulting dataset revealed that the chosen measurement strategy provided a near-synoptic survey with the longest records for the specific observing experiment considered. This work emphasizes the mutual usefulness of observations and numerical simulations in coastal ocean applications, and it proposes an efficient approach to harmonizing different expertise toward the investigation of a given research issue. Cucco, A., Sinerchia, M., Ribotti, A., Olita, A., Fazioli, L., Perilli, A., Sorgente, B., Borghini, M., Schroeder, K., Sorgente, R., 2012. A high-resolution real-time forecasting system for predicting the fate of oil spills in the Strait of Bonifacio (western Mediterranean Sea). Marine Pollution Bulletin 64(6), 1186-1200. Kallos, G., Nickovic, S., Papadopoulos, A., Jovic, D., Kakaliagou, O., Misirlis, N., Boukas, L., Mimikou, N., Anadranistakis, E., Manousakis, M., 1997. The regional weather forecasting system Skiron: An overview. In: Proceedings of the Symposium on Regional Weather Prediction on Parallel Computer Environments, Athens, Greece, 109-122. Umgiesser, G., Melaku Canu, D., Cucco, A., Solidoro, C., 2004. A finite element model for the Venice Lagoon. Development, set up, calibration and validation. Journal of Marine Systems 51, 123-145.
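The prediction of particle paths described in this abstract can be sketched as simple forward-Euler advection of lagrangian particles. The velocity field below is a toy stand-in for the SHYFEM3D surface currents, with a uniform drift as under a hypothetical Mistral-like (NW) wind; field, units and numbers are illustrative only:

```python
import numpy as np

def advect(positions, velocity, dt, steps):
    """Forward-Euler advection of lagrangian particles in a steady
    2-D velocity field; returns the full set of trajectories."""
    traj = [positions.copy()]
    for _ in range(steps):
        positions = positions + dt * velocity(positions)
        traj.append(positions.copy())
    return np.array(traj)              # shape (steps+1, n_particles, 2)

def wind_driven(p):
    # uniform 0.1 m/s drift toward the south-east, as might be induced
    # by a north-westerly (Mistral-like) wind
    return np.tile([0.1, -0.1], (len(p), 1))

start = np.zeros((4, 2))               # four buoys released at the origin
traj = advect(start, wind_driven, dt=60.0, steps=10)
print(traj.shape, traj[-1, 0])         # after 600 s: [60., -60.]
```

In the study above, such simulated paths were inspected to choose, in real time, where to deploy the buoys for the second field campaign.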

  19. Korean Environmental Health Survey in Children and Adolescents (KorEHS-C): survey design and pilot study results on selected exposure biomarkers.

    PubMed

    Ha, Mina; Kwon, Ho-Jang; Leem, Jong-Han; Kim, Hwan-Cheol; Lee, Kee Jae; Park, Inho; Lim, Young-Wook; Lee, Jong-Hyeon; Kim, Yeni; Seo, Ju-Hee; Hong, Soo-Jong; Choi, Youn-Hee; Yu, Jeesuk; Kim, Jeongseon; Yu, Seung-Do; Lee, Bo-Eun

    2014-03-01

    For the first nationwide representative survey on the environmental health of children and adolescents in Korea, we designed the Korean Environmental Health Survey in Children and Adolescents (KorEHS-C) as a two-phase survey and planned a sampling strategy that would represent the whole population of Korean children and adolescents, based on the school unit for the 6-19 years age group and the household unit for the 5 years or less age group. A pilot study of 351 children and adolescents aged 6 to 19 years in elementary, middle, and high schools of two cities was performed to validate several measurement methods and tools, to test their feasibility, and to refine the protocols used throughout the survey process. Selected exposure biomarkers were analyzed: lead, mercury, and cadmium in blood, and bisphenol A, metabolites of diethylhexyl phthalate and di-n-butyl phthalate, and cotinine in urine. We found that the levels of blood mercury (median: 1.7 ug/L) and cadmium (median: 0.30 ug/L) were much higher than those of subjects in Germany and the US, while metabolites of phthalates and bisphenol A showed similar levels and tendencies by age; the highest levels of phthalate metabolites and bisphenol A occurred in the youngest group of children. Specific investigations to elucidate the exposure pathways of the major environmental exposures need to be conducted, and the KorEHS-C should cover as many potential environmental hazards as possible. PMID:23831304

  20. A two-phase sampling design for increasing detections of rare species in occupancy surveys

    USGS Publications Warehouse

    Pacifici, Krishna; Dorazio, Robert M.; Dorazio, Michael J.

    2012-01-01

    1. Occupancy estimation is a commonly used tool in ecological studies owing to the ease with which data can be collected and the large spatial extent that can be covered. One major obstacle to using an occupancy-based approach is the complication associated with designing and implementing an efficient survey. These logistical challenges are magnified when working with rare species, for which effort can be wasted in areas with few or no individuals. 2. Here, we develop a two-phase sampling approach that mitigates these problems by using a design that places more effort in areas with a higher predicted probability of occurrence. We compare our new sampling design to traditional single-season occupancy estimation under a range of conditions and population characteristics. We develop an intuitive measure of predictive error to compare the two approaches and use simulations to assess the relative accuracy of each. 3. Our two-phase approach exhibited lower predictive error rates than the traditional single-season approach in highly spatially correlated environments. The difference was greatest when detection probability was high (0.75), regardless of the habitat or sample size. When the true occupancy rate was below 0.4 (in the range 0.05-0.4), we found that allocating 25% of the sample to the first phase resulted in the lowest error rates. 4. In the majority of scenarios, the two-phase approach showed lower error rates than the traditional single-season approach, suggesting that our new approach is fairly robust to a broad range of conditions and design factors and merits use in a wide variety of settings. 5. Synthesis and applications. Conservation and management of rare species are challenging tasks facing natural resource managers. It is critical for studies involving rare species to allocate effort and resources efficiently, as these are usually finite. 
We believe our approach provides a framework for the optimal allocation of effort while maximizing the information content of the data, in an attempt to provide the highest conservation value per unit of effort.
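The two-phase idea in this abstract can be sketched as an effort-allocation rule: spend a fraction of the survey effort on a random phase-1 sample, then direct the remaining effort at the sites with the highest predicted occupancy probability. The 25% phase-1 share follows the low-occupancy result quoted above; the predicted probabilities and everything else here are illustrative:

```python
import numpy as np

def two_phase_allocation(p_occ, n_total, phase1_frac=0.25, rng=None):
    """Split n_total site visits into a random phase-1 sample and a
    phase-2 sample targeting the highest predicted occupancy p_occ.
    A sketch of the design, not the authors' exact algorithm."""
    rng = rng or np.random.default_rng()
    n1 = int(round(phase1_frac * n_total))
    n_sites = len(p_occ)
    phase1 = rng.choice(n_sites, size=n1, replace=False)
    remaining = np.setdiff1d(np.arange(n_sites), phase1)
    # phase 2: visit the highest-probability remaining sites first
    order = remaining[np.argsort(p_occ[remaining])[::-1]]
    phase2 = order[: n_total - n1]
    return phase1, phase2

p = np.array([0.05, 0.8, 0.1, 0.6, 0.02, 0.9])   # predicted occupancy
ph1, ph2 = two_phase_allocation(p, n_total=4, rng=np.random.default_rng(0))
print(len(ph1), len(ph2))    # 1 3
```

In practice, p_occ would come from an occupancy model fitted to the phase-1 detections rather than being supplied up front.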

  1. Characteristics of Designated Drivers and their Passengers from the 2007 National Roadside Survey in the United States

    PubMed Central

    Bergen, Gwen; Yao, Jie; Shults, Ruth A.; Romano, Eduardo; Lacey, John

    2015-01-01

    Objective The objectives of this study were to estimate the prevalence of designated driving in the United States, compare these results with those from the 1996 National Roadside Survey, and explore the demographic, drinking, and trip characteristics of both designated drivers and their passengers. Methods The data used were from the 2007 National Roadside Survey, which randomly stopped drivers, administered breath tests for alcohol, and administered a questionnaire to drivers and front-seat passengers. Results Almost a third (30%) of nighttime drivers reported being designated drivers, with 84% of them having a blood alcohol concentration of zero. Drivers who were more likely to be designated drivers were those with a blood alcohol concentration that was over zero but still legal, who were under 35 years of age, who were African-American, Hispanic or Asian, and whose driving trip originated at a bar, tavern, or club. Over a third of passengers of designated drivers reported consuming an alcoholic drink the day of the survey compared with a fifth of passengers of non-designated drivers. One-fifth of designated driver passengers who reported drinking consumed five or more drinks that day. Conclusions Designated driving is widely used in the United States, with the majority of designated drivers abstaining from drinking alcohol. However, as designated driving separates drinking from driving for passengers in a group travelling together, it may encourage passengers to binge drink, which is associated with many adverse health consequences in addition to those arising from alcohol-impaired driving. Designated driving programs and campaigns, although not proven to be effective when used alone, can complement proven effective interventions to help reduce excessive drinking and alcohol-impaired driving. PMID:24372499

  2. SIS Mixer Design for a Broadband Millimeter Spectrometer Suitable for Rapid Line Surveys and Redshift Determinations

    NASA Technical Reports Server (NTRS)

    Rice, F.; Sumner, M.; Zmuidzinas, J.; Hu, R.; LeDuc, H.; Harris, A.; Miller, D.

    2004-01-01

    We present some detail of the waveguide probe and SIS mixer chip designs for a low-noise 180-300 GHz double-sideband receiver with an instantaneous RF bandwidth of 24 GHz. The receiver's single SIS junction is excited by a broadband, fixed-tuned waveguide probe on a silicon substrate. The IF output is coupled to a 6-18 GHz MMIC low-noise preamplifier. Following further amplification, the output is processed by an array of 4 GHz, 128-channel analog autocorrelation spectrometers (WASP II). The single-sideband receiver noise temperature goal of 70 Kelvin will provide a prototype instrument capable of rapid line surveys and of relatively efficient carbon monoxide (CO) emission line searches of distant, dusty galaxies. The latter application's goal is to determine redshifts by measuring the frequencies of CO line emissions from the star-forming regions dominating the submillimeter brightness of these galaxies. Construction of the receiver has begun; lab testing should begin in the fall. Demonstration of the receiver on the Caltech Submillimeter Observatory (CSO) telescope should begin in spring 2003.

  3. Design, Data Collection, Interview Timing, and Data Editing in the 1995 National Household Education Survey (NHES:95). Working Paper Series.

    ERIC Educational Resources Information Center

    Collins, Mary A.; Brick, J. Michael; Loomis, Laura S.; Nicchitta, Patricia G.; Fleischman, Susan

    The National Household Education Survey (NHES) is a data collection effort of the National Center for Education Statistics that collects and publishes data on the condition of education in the United States. The NHES is designed to provide information on issues that are best addressed by contacting households rather than institutions. It is a…

  4. CONDITION ASSESSMENT FOR THE ESCAMBIA RIVER, FL, WATERSHED: BENTHIC MACROINVERTEBRATE SURVEYS USING A PROBABILISTIC SAMPLING DESIGN (POSTER SESSION)

    EPA Science Inventory

    Probabilistic sampling has been used to assess the condition of estuarine ecosystems, and the use of this survey design approach was examined for a northwest Florida watershed. Twenty-eight lotic sites within the Escambia River, Florida, watershed were randomly selected and visit...

  5. Improving the design of amphibian surveys using soil data: A case study in two wilderness areas

    USGS Publications Warehouse

    Bowen, K.D.; Beever, E.A.; Gafvert, U.B.

    2009-01-01

    Amphibian populations are known, or thought to be, declining worldwide. Although protected natural areas may act as reservoirs of biological integrity and serve as benchmarks for comparison with unprotected areas, they are not immune from population declines and extinctions and should be monitored. Unfortunately, identifying survey sites and performing long-term fieldwork within such (often remote) areas involves a special set of problems. We used the USDA Natural Resource Conservation Service Soil Survey Geographic (SSURGO) Database to identify, a priori, potential habitat for aquatic-breeding amphibians on North and South Manitou Islands, Sleeping Bear Dunes National Lakeshore, Michigan, and compared the results to those obtained using National Wetland Inventory (NWI) data. The SSURGO approach identified more target sites for surveys than the NWI approach, and it identified more small and ephemeral wetlands. Field surveys used a combination of daytime call surveys, night-time call surveys, and perimeter surveys. We found that sites that would not have been identified with NWI data often contained amphibians and, in one case, contained wetland-breeding species that would not have been found using NWI data. Our technique allows for easy a priori identification of numerous survey sites that might not be identified using other sources of spatial information. We recognize, however, that the most effective site identification and survey techniques will likely use a combination of methods in addition to those described here.
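The SSURGO-based site screening described in this abstract can be sketched as a simple attribute filter: keep soil map units whose drainage class or hydric rating suggests standing water suitable for aquatic-breeding amphibians. The field names below follow common SSURGO attribute names (drainagecl, hydricrating) but are assumptions here, as is the wet-class list; the study's actual criteria are not given in the abstract:

```python
# Map units in these drainage classes are treated as potentially
# holding water (an assumed, illustrative rule).
WET_CLASSES = {"Very poorly drained", "Poorly drained"}

def candidate_sites(map_units):
    """map_units: iterable of dicts with 'mukey', 'drainagecl' and
    'hydricrating' keys; returns the mukeys worth a field visit."""
    return [m["mukey"] for m in map_units
            if m.get("hydricrating") == "Yes"
            or m.get("drainagecl") in WET_CLASSES]

units = [
    {"mukey": "101", "drainagecl": "Well drained", "hydricrating": "No"},
    {"mukey": "102", "drainagecl": "Poorly drained", "hydricrating": "No"},
    {"mukey": "103", "drainagecl": "Well drained", "hydricrating": "Yes"},
]
print(candidate_sites(units))   # ['102', '103']
```

In a real workflow the attribute tables would be joined to the SSURGO spatial layer in a GIS, and the flagged polygons would become the a priori survey sites.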

  6. National Aquatic Resource Surveys: Multiple objectives and constraints lead to design complexity

    EPA Science Inventory

    The US Environmental Protection Agency began conducting the National Aquatic resource Surveys (NARS) in 2007 with a national survey of lakes (NLA 2007) followed by rivers and streams in 2008-9 (NRSA 2008), coastal waters in 2010 (NCCA 2010) and wetlands in 2011 (NWCA). The surve...

  7. Design Evolution of the Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA) and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Peabody, Hume; Peters, Carlton; Rodriguez, Juan; McDonald, Carson; Content, David A.; Jackson, Cliff

    2015-01-01

    The design of the Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA) continues to evolve as each design cycle is analyzed. In 2012, two Hubble sized (2.4 m diameter) telescopes were donated to NASA from elsewhere in the Federal Government. NASA began investigating potential uses for these telescopes and identified WFIRST as a mission to benefit from these assets. With an updated, deeper, and sharper field of view than previous design iterations with a smaller telescope, the optical designs of the WFIRST instruments were updated and the mechanical and thermal designs evolved around the new optical layout. Beginning with Design Cycle 3, significant analysis efforts yielded a design and model that could be evaluated for Structural-Thermal-Optical-Performance (STOP) purposes for the Wide Field Imager (WFI) and provided the basis for evaluating the high level observatory requirements. Development of the Cycle 3 thermal model provided some valuable analysis lessons learned and established best practices for future design cycles. However, the Cycle 3 design did include some major liens and evolving requirements which were addressed in the Cycle 4 Design. Some of the design changes are driven by requirements changes, while others are optimizations or solutions to liens from previous cycles. Again in Cycle 4, STOP analysis was performed and further insights into the overall design were gained leading to the Cycle 5 design effort currently underway. This paper seeks to capture the thermal design evolution, with focus on major design drivers, key decisions and their rationale, and lessons learned as the design evolved.

  9. A survey of Utah's public secondary education science teachers to determine their feelings of preparedness to teach engineering design

    NASA Astrophysics Data System (ADS)

    Ames, R. Tyler

    The Next Generation Science Standards were released in 2013 and call for the inclusion of engineering design in the science classroom. This integration of science and engineering is very exciting for many people and groups in both fields, but considerable uncertainty remains about how prepared science teachers feel to teach engineering design. This study analyzes the history of science standards leading up to the Next Generation Science Standards, establishes key components of engineering design, and provides the background for the study detailed in this report. A survey was given to several hundred public secondary science teachers in the state of Utah in which respondents were asked to report their feelings of preparedness on several aspects of engineering design. The findings show that Utah teachers did not feel fully prepared to teach engineering design at the time of the study (2014).

  10. Integrating a multimode design into a national random-digit-dialed telephone survey.

    PubMed

    Hu, Shaohua Sean; Pierannunzi, Carol; Balluz, Lina

    2011-11-01

    The Behavioral Risk Factor Surveillance System (BRFSS) was originally conducted using a landline telephone survey mode of data collection. To meet the challenges of random-digit-dial (RDD) surveys and to ensure data quality and validity, BRFSS is integrating multiple modes of data collection. The survey of adults who use only cellular telephones is now conducted in parallel with the ongoing, monthly landline telephone BRFSS data collection, and a mail follow-up survey is being implemented to increase response rates and to assess nonresponse bias. A pilot study in which respondents' physical measurements are taken is being conducted to assess the feasibility of collecting these data for a subsample of adults in 2 states. Physical measures would allow for the adjustment of key self-reported risk factor and health condition estimates and improve the accuracy and usefulness of BRFSS data. This article provides an overview of these new modes of data collection. PMID:22005638

  11. Design and Evaluation of Digital Learning Material to Support Acquisition of Quantitative Problem-Solving Skills within Food Chemistry

    ERIC Educational Resources Information Center

    Diederen, Julia; Gruppen, Harry; Hartog, Rob; Voragen, Alphons G. J.

    2005-01-01

    One of the modules in the course Food Chemistry at Wageningen University (Wageningen, The Netherlands) focuses on quantitative problem-solving skills related to chemical reactions. The intended learning outcomes of this module are firstly, to be able to translate practical food chemistry related problems into mathematical equations and to solve…

  12. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on the computer simulation data, the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment. PMID:25175727
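The coefficient of variation used above as the quantitative criticality measure is straightforward to compute and rank on. The failure-mode names and the per-tablet coating masses below are hypothetical, standing in for the simulated pan-coater runs:

```python
import numpy as np

def coating_cv(masses):
    """Coefficient of variation of per-tablet coating mass, the
    quantitative measure of coating mass uniformity named above."""
    masses = np.asarray(masses, dtype=float)
    return masses.std(ddof=1) / masses.mean()

def rank_failure_modes(runs):
    """runs: {failure_mode_name: per-tablet coating masses from the
    corresponding simulated run}. Higher CV = higher risk priority."""
    return sorted(runs, key=lambda k: coating_cv(runs[k]), reverse=True)

runs = {
    "low spray rate": [9.8, 10.1, 10.0, 10.2, 9.9],    # mg, hypothetical
    "fast pan speed": [8.5, 11.6, 9.9, 12.0, 8.0],
}
print(rank_failure_modes(runs))   # ['fast pan speed', 'low spray rate']
```

In the workflow above, this ranking feeds the failure mode and effects analysis, replacing subjective severity scores with a simulation-derived number.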

  13. A National Baseline Prevalence Survey of Schistosomiasis in the Philippines Using Stratified Two-Step Systematic Cluster Sampling Design

    PubMed Central

    Leonardo, Lydia; Rivera, Pilarita; Saniel, Ofelia; Villacorte, Elena; Lebanan, May Antonnette; Crisostomo, Bobby; Hernandez, Leda; Baquilod, Mario; Erce, Edgardo; Martinez, Ruth; Velayudhan, Raman

    2012-01-01

    For the first time in the country, a national baseline prevalence survey using a well-defined sampling design, a stratified two-step systematic cluster sampling, was conducted from 2005 to 2008. The purpose of the survey was to stratify the provinces according to the prevalence of schistosomiasis as high, moderate, or low, which in turn would be used as the basis for the intervention program to be implemented. The national survey was divided into four phases. Results of the first two phases, conducted in Mindanao and the Visayas, were published in 2008. Data from the last two phases showed three provinces with prevalence rates higher than the endemic provinces surveyed in the first two phases, thus changing the overall ranking of endemic provinces at the national level. The age and sex distribution of schistosomiasis remained the same in Luzon and Maguindanao. Soil-transmitted and food-borne helminths were also recorded in these surveys. This paper deals with the results of the last two phases, done in Luzon and Maguindanao, and integrates all four phases in the discussion. PMID:22518170
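The prevalence stratification described in this abstract amounts to binning each province by its estimated prevalence. The cut-offs below are illustrative placeholders, not the thresholds used by the Philippine survey, and the province data are invented:

```python
def stratify(prevalence, high=0.10, moderate=0.01):
    """Assign a province to a prevalence stratum. The high/moderate
    cut-offs are assumed values for illustration only."""
    if prevalence >= high:
        return "high"
    if prevalence >= moderate:
        return "moderate"
    return "low"

provinces = {"A": 0.15, "B": 0.03, "C": 0.004}   # hypothetical estimates
print({p: stratify(v) for p, v in provinces.items()})
# {'A': 'high', 'B': 'moderate', 'C': 'low'}
```

The resulting strata would then drive the intensity of the intervention program, as the abstract describes.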

  14. [New design of the Health Survey of Catalonia (Spain, 2010-2014): a step forward in health planning and evaluation].

    PubMed

    Alcañiz-Zanón, Manuela; Mompart-Penina, Anna; Guillén-Estany, Montserrat; Medina-Bustos, Antonia; Aragay-Barbany, Josep M; Brugulat-Guiteras, Pilar; Tresserras-Gaju, Ricard

    2014-01-01

    This article presents the genesis of the Health Survey of Catalonia (Spain, 2010-2014) with its semiannual subsamples and explains the basic characteristics of its multistage sampling design. In comparison with previous surveys, the organizational advantages of this new statistical operation include rapid data availability and the ability to continuously monitor the population. The main benefits are timeliness in the production of indicators and the possibility of introducing new topics through the supplemental questionnaire as a function of needs. Limitations consist of the complexity of the sample design and the lack of longitudinal follow-up of the sample. Suitable sampling weights for each specific subsample are necessary for any statistical analysis of micro-data. Accuracy in the analysis of territorial disaggregation or population subgroups increases if annual samples are accumulated. PMID:24472532
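    The weighting requirement noted above can be illustrated with a toy example: each semiannual subsample carries its own design weights, and pooling annual samples amounts to concatenating values and weights before estimating. All numbers below are hypothetical:

    ```python
    # Weighted estimation with subsample-specific design weights (toy data).
    def weighted_mean(values, weights):
        return sum(v * w for v, w in zip(values, weights)) / sum(weights)

    # Two semiannual subsamples; each respondent has a design weight.
    subsample_1 = {"values": [1, 0, 1, 1], "weights": [120.0, 95.0, 110.0, 80.0]}
    subsample_2 = {"values": [0, 1, 0], "weights": [105.0, 130.0, 90.0]}

    # Accumulating subsamples: concatenate values and weights, then estimate.
    values = subsample_1["values"] + subsample_2["values"]
    weights = subsample_1["weights"] + subsample_2["weights"]
    print(round(weighted_mean(values, weights), 3))  # 0.603
    ```

    Analyzing a single subsample with the pooled weights (or vice versa) would bias the estimate, which is why the article stresses using the weights specific to each subsample.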

  15. Quantitative genetics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The majority of economically important traits targeted for cotton improvement are quantitatively inherited. In this chapter, the current state of cotton quantitative genetics is described and separated into four components. These components include: 1) traditional quantitative inheritance analysis, ...

  16. A survey of ground operations tools developed to simulate the pointing of space telescopes and the design for WISE

    NASA Technical Reports Server (NTRS)

    Fabinsky, Beth

    2006-01-01

    WISE, the Wide Field Infrared Survey Explorer, is scheduled for launch in June 2010. The mission operations system for WISE requires a software modeling tool to help plan, integrate and simulate all spacecraft pointing and verify that no attitude constraints are violated. In the course of developing the requirements for this tool, an investigation was conducted into the design of similar tools for other space-based telescopes. This paper summarizes the ground software and processes used to plan and validate pointing for a selection of space telescopes; with this information as background, the design for WISE is presented.

  17. Quantitative Assessment of a Senge Learning Organization Intervention

    ERIC Educational Resources Information Center

    Kiedrowski, P. Jay

    2006-01-01

    Purpose: To quantitatively assess a Senge learning organization (LO) intervention to determine if it would result in improved employee satisfaction. Design/methodology/approach: A Senge LO intervention in Division 123 of Company ABC was undertaken in 2000. Three employee surveys using Likert-scale questions over five years and correlation analysis…

  18. Design of a Mars Airplane Propulsion System for the Aerial Regional-Scale Environmental Survey (ARES) Mission Concept

    NASA Technical Reports Server (NTRS)

    Kuhl, Christopher A.

    2008-01-01

    The Aerial Regional-Scale Environmental Survey (ARES) is a Mars exploration mission concept that utilizes a rocket propelled airplane to take scientific measurements of atmospheric, surface, and subsurface phenomena. The liquid rocket propulsion system design has matured through several design cycles and trade studies since the inception of the ARES concept in 2002. This paper describes the process of selecting a bipropellant system over other propulsion system options, and provides details on the rocket system design, thrusters, propellant tank and PMD design, propellant isolation, and flow control hardware. The paper also summarizes computer model results of thruster plume interactions and simulated flight performance. The airplane has a 6.25 m wingspan with a total wet mass of 185 kg and has the ability to fly over 600 km through the atmosphere of Mars with 45 kg of MMH/MON3 propellant.

  19. Evidence for disaster risk reduction, planning and response: design of the Evidence Aid survey.

    PubMed

    Clarke, Mike; Kayabu, Bonnix

    2011-01-01

    Systematic reviews are now regarded as a key component of the decision making process in health care, and, increasingly, in other areas. This should also be true in disaster risk reduction, planning and response. Since the Indian Ocean tsunami in 2004, The Cochrane Collaboration and others have been working together to strengthen the use and usefulness of systematic reviews in this field, through Evidence Aid. Evidence Aid is conducting a survey to identify the attitudes of those involved in the humanitarian response to natural disasters and other crises towards systematic reviews and research in such settings; their priorities for evidence, and their preferences for how the information should be made accessible. This article contains an outline of the survey instrument, which is available in full from www.EvidenceAid.org. The preliminary findings of the survey will be published in future articles. PMID:22031230

  20. Is the Linear Modeling Technique Good Enough for Optimal Form Design? A Comparison of Quantitative Analysis Models

    PubMed Central

    Lin, Yang-Cheng; Yeh, Chung-Hsing; Wang, Chen-Cheng; Wei, Chun-Chun

    2012-01-01

    How to design highly reputable and hot-selling products is an essential issue in product design. Whether consumers choose a product depends largely on their perception of the product image. A consumer-oriented design approach presented in this paper helps product designers incorporate consumers' perceptions of product forms in the design process. The consumer-oriented design approach uses quantification theory type I, grey prediction (the linear modeling technique), and neural networks (the nonlinear modeling technique) to determine the optimal form combination of product design for matching a given product image. An experimental study based on the concept of Kansei Engineering is conducted to collect numerical data for examining the relationship between consumers' perception of product image and product form elements of personal digital assistants (PDAs). The result of performance comparison shows that the QTTI model is good enough to help product designers determine the optimal form combination of product design. Although the PDA form design is used as a case study, the approach is applicable to other consumer products with various design elements and product images. The approach provides an effective mechanism for facilitating the consumer-oriented product design process. PMID:23258961

  1. Is the linear modeling technique good enough for optimal form design? A comparison of quantitative analysis models.

    PubMed

    Lin, Yang-Cheng; Yeh, Chung-Hsing; Wang, Chen-Cheng; Wei, Chun-Chun

    2012-01-01

    How to design highly reputable and hot-selling products is an essential issue in product design. Whether consumers choose a product depends largely on their perception of the product image. A consumer-oriented design approach presented in this paper helps product designers incorporate consumers' perceptions of product forms in the design process. The consumer-oriented design approach uses quantification theory type I, grey prediction (the linear modeling technique), and neural networks (the nonlinear modeling technique) to determine the optimal form combination of product design for matching a given product image. An experimental study based on the concept of Kansei Engineering is conducted to collect numerical data for examining the relationship between consumers' perception of product image and product form elements of personal digital assistants (PDAs). The result of performance comparison shows that the QTTI model is good enough to help product designers determine the optimal form combination of product design. Although the PDA form design is used as a case study, the approach is applicable to other consumer products with various design elements and product images. The approach provides an effective mechanism for facilitating the consumer-oriented product design process. PMID:23258961

  2. Essential Steps for Web Surveys: A Guide to Designing, Administering and Utilizing Web Surveys for University Decision-Making. Professional File. Number 102, Winter 2006

    ERIC Educational Resources Information Center

    Cheskis-Gold, Rena; Loescher, Ruth; Shepard-Rabadam, Elizabeth; Carroll, Barbara

    2006-01-01

    During the past few years, several Harvard paper surveys were converted to Web surveys. These were high-profile surveys endorsed by the Provost and the Dean of the College, and covered major portions of the university population (all undergraduates, all graduate students, tenured and non-tenured faculty). When planning for these surveys started in…

  3. Survey of waste package designs for disposal of high-level waste/spent fuel in selected foreign countries

    SciTech Connect

    Schneider, K.J.; Lakey, L.T.; Silviera, D.J.

    1989-09-01

    This report presents the results of a survey of the waste package strategies for seven western countries with active nuclear power programs that are pursuing disposal of spent nuclear fuel or high-level wastes in deep geologic rock formations. Information, current as of January 1989, is given on the leading waste package concepts for Belgium, Canada, France, Federal Republic of Germany, Sweden, Switzerland, and the United Kingdom. All but two of the countries surveyed (France and the UK) have developed design concepts for their repositories, but none of the countries has developed its final waste repository or package concept. Waste package concepts are under study in all the countries surveyed, except the UK. Most of the countries have not yet developed a reference concept and are considering several concepts. Most of the information presented in this report is for the current reference or leading concepts. All canisters for the wastes are cylindrical, and are made of metal (stainless steel, mild steel, titanium, or copper). The canister concepts have relatively thin walls, except those for spent fuel in Sweden and Germany. Diagrams are presented for the reference or leading concepts for canisters for the countries surveyed. The expected lifetimes of the conceptual canisters in their respective disposal environment are typically 500 to 1,000 years, with Sweden's copper canister expected to last as long as one million years. Overpack containers that would contain the canisters are being considered in some of the countries. All of the countries surveyed, except one (Germany), are currently planning to utilize a buffer material (typically bentonite) surrounding the disposal package in the repository. Most of the countries surveyed plan to limit the maximum temperature in the buffer material to about 100 °C. 52 refs., 9 figs.

  4. A design of strategic alliance based on value chain of surveying and mapping enterprises in China

    NASA Astrophysics Data System (ADS)

    Duan, Hong; Huang, Xianfeng

    2007-06-01

    In this paper, we apply value chain and strategic alliance theories to analyze the surveying and mapping industry and its enterprises in China. The value chain of surveying and mapping enterprises is highly interconnected but fragmented by administrative interference, and the enterprises are typically small in scale. Given this situation, we argue that establishing a non-equity-holding strategic alliance based on the value chain is a viable approach: it lets enterprises share superior resources across different sectors of the whole value chain while avoiding conflicts with the interests of the related administrative departments, allowing surveying and mapping enterprises to develop both individually and collectively. We then present a method for building the strategic alliance model by partitioning the value chain and exploiting the advantages of companies in different value chain sectors. Finally, we use game theory to analyze the internal dynamics of the strategic alliance and show that it is a suitable way to advance the development of surveying and mapping enterprises.

  5. The TRacking Adolescents' Individual Lives Survey (TRAILS): Design, Current Status, and Selected Findings

    ERIC Educational Resources Information Center

    Ormel, Johan; Oldehinkel, Albertine J.; Sijtsema, Jelle; van Oort, Floor; Raven, Dennis; Veenstra, Rene; Vollebergh, Wilma A. M.; Verhulst, Frank C.

    2012-01-01

    Objectives: The objectives of this study were as follows: to present a concise overview of the sample, outcomes, determinants, non-response and attrition of the ongoing TRacking Adolescents' Individual Lives Survey (TRAILS), which started in 2001; to summarize a selection of recent findings on continuity, discontinuity, risk, and protective…

  6. The Results of the National Heritage Language Survey: Implications for Teaching, Curriculum Design, and Professional Development

    ERIC Educational Resources Information Center

    Carreira, Maria; Kagan, Olga

    2011-01-01

    This article reports on a survey of heritage language learners (HLLs) across different heritage languages (HLs) and geographic regions in the United States. A general profile of HLLs emerges as a student who (1) acquired English in early childhood, after acquiring the HL; (2) has limited exposure to the HL outside the home; (3) has relatively…

  8. Student Destination Surveys: Design and Development of an Instrument for Use by TAFE Agencies.

    ERIC Educational Resources Information Center

    Walsh, Lynne

    The development of the student destination survey instrument was a response to an identified need for a tool to measure the effectiveness of Technical and Further Education (TAFE) programs in Australia. The requirement was for a tool that was flexible in accommodating changing course structures and consistent across state and territory boundaries.

  9. German health interview and examination survey for adults (DEGS) - design, objectives and implementation of the first data collection wave

    PubMed Central

    2012-01-01

    Background The German Health Interview and Examination Survey for Adults (DEGS) is part of the recently established national health monitoring conducted by the Robert Koch Institute. DEGS combines a nationally representative periodic health survey and a longitudinal study based on follow-up of survey participants. Funding is provided by the German Ministry of Health and supplemented for specific research topics from other sources. Methods/design The first DEGS wave of data collection (DEGS1) extended from November 2008 to December 2011. Overall, 8152 men and women participated. Of these, 3959 persons already participated in the German National Health Interview and Examination Survey 1998 (GNHIES98) at which time they were 18–79 years of age. Another 4193 persons 18–79 years of age were recruited for DEGS1 in 2008–2011 based on two-stage stratified random sampling from local population registries. Health data and context variables were collected using standardized computer assisted personal interviews, self-administered questionnaires, and standardized measurements and tests. In order to keep survey results representative for the population aged 18–79 years, results will be weighted by survey-specific weighting factors considering sampling and drop-out probabilities as well as deviations between the design-weighted net sample and German population statistics 2010. Discussion DEGS aims to establish a nationally representative data base on health of adults in Germany. This health data platform will be used for continuous health reporting and health care research. The results will help to support health policy planning and evaluation. Repeated cross-sectional surveys will permit analyses of time trends in morbidity, functional capacity levels, disability, and health risks and resources. Follow-up of study participants will provide the opportunity to study trajectories of health and disability. A special focus lies on chronic diseases including asthma, allergies, cardiovascular conditions, diabetes mellitus, and musculoskeletal diseases. Other core topics include vaccine-preventable diseases and immunization status, nutritional deficiencies, health in older age, and the association between health-related behavior and mental health. PMID:22938722

  10. Updated Optimal Designs of Time-Lapse Seismic Surveys for Monitoring CO2 Leakage through Fault Zones

    NASA Astrophysics Data System (ADS)

    Liu, J.; Shang, X.; Sun, Y.; Chen, P.

    2012-12-01

    Cost-effective time-lapse seismic surveys are crucial for long-term monitoring of geologic carbon sequestration. Similar to Shang and Huang (2012), in this study we have numerically modeled time-lapse seismic surveys for monitoring CO2 leakage through fault zones, and designed updated optimal surveys for time-lapse seismic data acquisition using elastic-wave sensitivity analysis. When CO2 was confined to a relatively deep region, our results show that the most desirable location for surface receivers is on the hanging-wall side of the two fault zones, for both high-angle normal faults and reverse faults. The surface locations most sensitive to changes in the P- and S-wave velocities and density are similar to one another, but are often not sensitive to the source location. When CO2 migrates close to the surface, our modeling suggests that the best surface region for time-lapse seismic surveys is very sensitive to the source location and to the elastic parameter to be monitored.

  11. Monte Carlo Simulation of the Two-Dimensional Site Percolation Problem for Designing Sensitive and Quantitatively Analyzable Field-Effect Transistors

    NASA Astrophysics Data System (ADS)

    Kasama, Toshihiro; Nakajima, Anri

    2009-10-01

    To investigate the effect of charged substances on the properties of ion-sensitive field-effect transistors (ISFETs), the site percolation property of a finite-size two-dimensional square-lattice system was analyzed by Monte Carlo simulation. We found that the variation in the aspect ratio (width/length) of the channel leads to two important features: the sensitivity of the ISFET is enhanced with a decrease in width and/or an increase in length; however, ISFET having a rather wide and/or short channel produces the best performance in quantitative analysis. The results of this study would be applicable to the design of an ultrasensitive and quantitatively analyzable ISFET.
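    A minimal version of such a Monte Carlo site-percolation experiment can be written as follows; the lattice sizes, occupation probability, and trial count are illustrative choices, not the paper's parameters. The example estimates the probability that an occupied cluster spans the channel for two aspect ratios:

    ```python
    import random
    from collections import deque

    def percolates(grid):
        """True if occupied sites connect top row to bottom row (4-neighbour)."""
        n, m = len(grid), len(grid[0])
        seen = set((0, j) for j in range(m) if grid[0][j])
        q = deque(seen)
        while q:
            i, j = q.popleft()
            if i == n - 1:
                return True
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < n and 0 <= nj < m and grid[ni][nj] \
                        and (ni, nj) not in seen:
                    seen.add((ni, nj))
                    q.append((ni, nj))
        return False

    def spanning_probability(n, m, p, trials=200, rng=random.Random(0)):
        """Estimate P(spanning) on an n x m lattice with site probability p."""
        hits = 0
        for _ in range(trials):
            grid = [[rng.random() < p for _ in range(m)] for _ in range(n)]
            hits += percolates(grid)
        return hits / trials

    # Aspect ratio matters: at the same occupation probability, a long, narrow
    # channel spans less often than a short, wide one.
    print(spanning_probability(20, 5, 0.6))  # long, narrow: spans rarely
    print(spanning_probability(5, 20, 0.6))  # short, wide: spans often
    ```

    In the ISFET analogy, the occupied sites play the role of channel regions modulated by charged substances, and the spanning probability's sensitivity to p is what the channel geometry tunes.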

  12. Design and Specification of Optical Bandpass Filters for Hubble Space Telescope (HST) Advanced Camera for Surveys (ACS)

    NASA Technical Reports Server (NTRS)

    Leviton, Douglas B.; Tsevetanov, Zlatan; Woodruff, Bob; Mooney, Thomas A.

    1998-01-01

    Advanced optical bandpass filters for the Hubble Space Telescope (HST) Advanced Camera for Surveys (ACS) have been developed on a filter-by-filter basis through detailed studies which take into account the instrument's science goals, available optical filter fabrication technology, and developments in ACS's charge-coupled-device (CCD) detector technology. These filters include a subset of filters for the Sloan Digital Sky Survey (SDSS) which are optimized for astronomical photometry using today's charge-coupled-devices (CCD's). In order for ACS to be truly advanced, these filters must push the state-of-the-art in performance in a number of key areas at the same time. Important requirements for these filters include outstanding transmitted wavefront, high transmittance, uniform transmittance across each filter, spectrally structure-free bandpasses, exceptionally high out of band rejection, a high degree of parfocality, and immunity to environmental degradation. These constitute a very stringent set of requirements indeed, especially for filters which are up to 90 mm in diameter. The highly successful paradigm in which final specifications for flight filters were derived through interaction amongst the ACS Science Team, the instrument designer, the lead optical engineer, and the filter designer and vendor is described. Examples of iterative design trade studies carried out in the context of science needs and budgetary and schedule constraints are presented. An overview of the final design specifications for the ACS bandpass and ramp filters is also presented.

  13. Site study plan for EDBH (Engineering Design Boreholes) seismic surveys, Deaf Smith County site, Texas: Revision 1

    SciTech Connect

    Hume, H.

    1987-12-01

    This site study plan describes seismic reflection surveys to run north-south and east-west across the Deaf Smith County site, intersecting near the Engineering Design Boreholes (EDBH). Both conventional and shallow high-resolution surveys will be run. The field program has been designed to acquire subsurface geologic and stratigraphic data to address information/data needs resulting from Federal and State regulations and Repository program requirements. The data acquired by the conventional surveys will be common-depth-point, seismic reflection data optimized for reflection events that indicate geologic structure near the repository horizon. The data will also resolve the basement structure and shallow reflection events up to about the top of the evaporite sequence. Field acquisition includes a testing phase to check and select parameters and a production phase. The field data will be subjected immediately to conventional data processing and interpretation to determine if there are any anomalous structural or stratigraphic conditions that could affect the choice of the EDBH sites. After the EDBHs have been drilled and logged, including vertical seismic profiling, the data will be reprocessed and reinterpreted for detailed structural and stratigraphic information to guide shaft development. The shallow high-resolution seismic reflection lines will be run along the same alignments, but the lines will be shorter and limited to the immediate vicinity of the EDBH sites. These lines are planned to detect faults or thick channel sands that may be present at the EDBH sites. 23 refs., 7 figs., 5 tabs.

  14. A survey of student opinions on ethical design standards in Taiwan.

    PubMed

    Lee, Yingying; You, Manlai; Yang, Ming-Ying

    2015-04-01

    Design ethics has been offered as a course in undergraduate design programs in Taiwan for over a decade, but research on teaching design ethics and the results of teaching these courses is scant. We conducted two tests to examine (1) the effect of an ethics course, and (2) the differences among the effects of design department, gender, and study year on student opinions regarding ethical design standards (EDSs) at the National Yunlin University of Science and Technology (YunTech) in Taiwan. The participants comprised 934 undergraduates (660 women and 274 men) from the five design departments at YunTech's College of Design from Years 1-4. The results confirmed the effect of an ethics course on student EDS opinions. In addition, we observed significant variations among students according to design departments, suggesting that the characteristics of the design departments also affected students' EDS opinions. The results indicated that gender did not significantly affect design students' EDS opinions; however, students in their early years of study produced higher scores than those in their advanced years of study did, based on the six EDS opinions. The implications of these results for teaching design ethics and future research are discussed in this paper. PMID:24744117

  15. A Meta-analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    NASA Astrophysics Data System (ADS)

    Zhang, Lin

    2013-07-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tools. This paper discusses the emerging issues, such as: How can learning effectiveness be understood in relation to different technology features? And how can pieces of qualitative and quantitative results be integrated to achieve a broader understanding of technology designs? To address these issues, this paper proposes a meta-analysis method. Detailed explanations about the structure of the methodology and its scientific mechanism are provided for discussion and suggestions. This paper ends with an in-depth discussion of the concerns and questions that educational researchers might raise, such as how this methodology takes care of learning contexts.

  16. Survey Says

    ERIC Educational Resources Information Center

    McCarthy, Susan K.

    2005-01-01

    Survey Says is a lesson plan designed to teach college students how to access Internet resources for valid data related to the sexual health of young people. Discussion questions based on the most recent available data from two national surveys, the Youth Risk Behavior Surveillance-United States, 2003 (CDC, 2004) and the National Survey of

  17. Survey of editors and reviewers of high-impact psychology journals: statistical and research design problems in submitted manuscripts.

    PubMed

    Harris, Alex; Reeder, Rachelle; Hyun, Jenny

    2011-01-01

    The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved. PMID:21560804

  18. Literature survey, numerical examples, and recommended design studies for main-coolant pumps. Final report. [PWR; BWR

    SciTech Connect

    Allaire, P.E.; Barrett, L.E.

    1982-06-01

    This report presents an up-to-date literature survey, examples of calculations of seal forces or other pump properties, and recommendations for future work pertaining to primary coolant pumps and primary recirculating pumps in the nuclear power industry. Five main areas are covered: pump impeller forces, fluid annuli, bearings, seals, and rotor calculations. The main conclusion is that forces in pump impellers are perhaps the least well understood area, seals have had some good design work done on them recently, fluid annuli effects are being discussed in the literature, bearing designs are fairly well known, and rotor calculations have been discussed widely in the literature. It should be noted, however, that the literature in a given area is usually not applied to pumps in nuclear power stations. The most immediate need for a combined theoretical and experimental design capability exists in mechanical face seals.

  19. A Quantitative Research Investigation into High School Design and Art Education in a Local High School in Texas

    ERIC Educational Resources Information Center

    Lin, Yi-Hsien

    2013-01-01

    This study was designed to explore the differences between high school teachers with art and science backgrounds in terms of curriculum and student performance in art and design education, federal educational policy, and financial support. The study took place in a local independent school district in Texarkana, Texas. The independent school…

  20. Design of a detection survey for Ostreid herpesvirus-1 using hydrodynamic dispersion models to determine epidemiological units.

    PubMed

    Pande, Anjali; Acosta, Hernando; Brangenberg, Naya Alexis; Keeling, Suzanne Elizabeth

    2015-04-01

    Using Ostreid herpesvirus-1 (OsHV-1) as a case study, this paper considers a survey design methodology for an aquatic animal pathogen that incorporates the concept of biologically independent epidemiological units. Hydrodynamically-modelled epidemiological units are used to divide marine areas into sensible sampling units for detection surveys of waterborne diseases. In the aquatic environment it is difficult to manage disease at the animal level, hence management practices are often aimed at a group of animals sharing a similar risk. Using epidemiological units is a way to define these groups, based on a similar level of probability of exposure based on the modelled potential spread of a viral particle via coastal currents, that can help inform management decisions. PMID:25746929

  1. The Nairobi Birth Survey 1. the study design, the population and outline results.

    PubMed

    Mati, J K; Aggarwal, V P; Lucas, S; Sanghvi, H C; Corkhill, R

    1982-12-01

    The Nairobi Birth Survey was planned with the following objectives: 1) establish the social, obstetric and epidemiological characteristics of the obstetric population of Nairobi, Kenya; 2) examine the pattern and distribution of antenatal and delivery care; and 3) assess the true incidence of stillbirths and 1st 24-hour neonatal deaths, congenital abnormalities and major obstetric complications. The Survey consisted of 1) a study of all stillbirths and 24-hour neonatal deaths over a period of 7 months (March-September 1981), and 2) recording of all births taking place in Nairobi over a 7-week period (June 15-August 4, 1981). During the 7-week period there were 5,293 single births, including 187 perinatal deaths, with a stillbirth rate of 23/1,000 births and a 24-hour neonatal death rate of 12/1,000. The obstetric population was found to be predominantly young, with 57.8% of all mothers being under 25 years of age. Nearly 20% were teenagers. 23% of the mothers were having their 5th or more children at the time of the Survey. In 79.3% of the mothers the antenatal period was uncomplicated. Hypertensive disease in pregnancy was found to be the leading cause of complications, existing in 10.4% of the pregnancies. The majority of the mothers delivered in public institutions. Together with the student midwives, midwives conducted 79.7% of the births. The 3 maternal deaths in this survey give a maternal mortality rate of 0.56/1,000 deliveries. 701 perinatal deaths occurred in the 7-month study, which corresponds to 71.2%. These deaths were mostly associated with complications of labor, including prolonged and difficult labor. In 40.9% of the cases the deaths could have been avoided with appropriate action. Of the 436 babies that were autopsied, 33 had congenital abnormalities. PMID:12313673

  2. Optical Design of the Camera for Transiting Exoplanet Survey Satellite (TESS)

    NASA Technical Reports Server (NTRS)

    Chrisp, Michael; Clark, Kristin; Primeau, Brian; Dalpiaz, Michael; Lennon, Joseph

    2015-01-01

    The optical design of the wide field of view refractive camera, 34 degrees diagonal field, for the TESS payload is described. This fast f/1.4 cryogenic camera, operating at -75 C, has no vignetting for maximum light gathering within the size and weight constraints. Four of these cameras capture full frames of star images for photometric searches of planet crossings. The optical design evolution, from the initial Petzval design, took advantage of Forbes aspheres to develop a hybrid design form. This maximized the correction from the two aspherics resulting in a reduction of average spot size by sixty percent in the final design. An external long wavelength pass filter was replaced by an internal filter coating on a lens to save weight, and has been fabricated to meet the specifications. The stray light requirements were met by an extended lens hood baffle design, giving the necessary off-axis attenuation.

  3. Evaluating cost-efficiency and accuracy of hunter harvest survey designs

    USGS Publications Warehouse

    Lukacs, P.M.; Gude, J.A.; Russell, R.E.; Ackerman, B.B.

    2011-01-01

    Effective management of harvested wildlife often requires accurate estimates of the number of animals harvested annually by hunters. A variety of techniques exist to obtain harvest data, such as hunter surveys, check stations, mandatory reporting requirements, and voluntary reporting of harvest. Agencies responsible for managing harvested wildlife such as deer (Odocoileus spp.), elk (Cervus elaphus), and pronghorn (Antilocapra americana) are challenged with balancing the cost of data collection versus the value of the information obtained. We compared precision, bias, and relative cost of several common strategies, including hunter self-reporting and random sampling, for estimating hunter harvest using a realistic set of simulations. Self-reporting with a follow-up survey of hunters who did not report produces the best estimate of harvest in terms of precision and bias, but it is also, by far, the most expensive technique. Self-reporting with no follow-up survey risks very large bias in harvest estimates, and the cost increases with increased response rate. Probability-based sampling provides a substantial cost savings, though accuracy can be affected by nonresponse bias. We recommend stratified random sampling with a calibration estimator used to reweight the sample based on the proportions of hunters responding in each covariate category as the best option for balancing cost and accuracy. © 2011 The Wildlife Society.
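    The recommended calibration estimator amounts to post-stratified reweighting: respondents in each covariate category are scaled up to that category's share of licensed hunters, which reduces nonresponse bias. A minimal sketch follows; the strata, hunter counts, and harvest figures are invented for illustration and are not from the study.

```python
# Hypothetical sketch of a calibration (post-stratified) harvest estimator.
# Each stratum carries the licensed-hunter count (N), the number of survey
# respondents (n_resp), and the total harvest reported by those respondents.

def calibrated_harvest_estimate(strata):
    """Reweight per-respondent mean harvest up to each stratum's full size."""
    total = 0.0
    for s in strata:
        mean_harvest = s["reported_harvest"] / s["n_resp"]  # mean per respondent
        total += s["N"] * mean_harvest                      # scale to stratum size
    return total

# Invented example: resident vs. nonresident license holders.
strata = [
    {"N": 50_000, "n_resp": 2_000, "reported_harvest": 600.0},
    {"N": 5_000,  "n_resp": 400,   "reported_harvest": 220.0},
]
print(calibrated_harvest_estimate(strata))  # -> 17750.0
```

    If response propensity differs across categories (as it typically does), this reweighting corrects the naive pooled estimate, which would otherwise be biased toward the behavior of the more responsive group.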

  4. Performance of small cluster surveys and the clustered LQAS design to estimate local-level vaccination coverage in Mali

    PubMed Central

    2012-01-01

    Background Estimation of vaccination coverage at the local level is essential to identify communities that may require additional support. Cluster surveys can be used in resource-poor settings, when population figures are inaccurate. To be feasible, cluster samples need to be small, without losing robustness of results. The clustered LQAS (CLQAS) approach has been proposed as an alternative, as smaller sample sizes are required. Methods We explored (i) the efficiency of cluster surveys of decreasing sample size through bootstrapping analysis and (ii) the performance of CLQAS under three alternative sampling plans to classify local vaccination coverage (VC), using data from a survey carried out in Mali after mass vaccination against meningococcal meningitis group A. Results VC estimates provided by a 10 × 15 cluster survey design were reasonably robust. We used them to classify health areas in three categories and guide mop-up activities: i) health areas not requiring supplemental activities; ii) health areas requiring additional vaccination; iii) health areas requiring further evaluation. As sample size decreased (from 10 × 15 to 10 × 3), standard errors of VC and ICC estimates became increasingly unstable. Results of CLQAS simulations were not accurate for most health areas, with an overall risk of misclassification greater than 0.25 in one health area out of three. It was greater than 0.50 in one health area out of two under two of the three sampling plans. Conclusions Small sample cluster surveys (10 × 15) are acceptably robust for classification of VC at local level. We do not recommend the CLQAS method as currently formulated for evaluating vaccination programmes. PMID:23057445
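    The bootstrapping analysis described above can be illustrated schematically: resample clusters with replacement, subsample children within each cluster, and watch the standard error of the vaccination-coverage (VC) estimate grow as the per-cluster sample shrinks from 15 toward 3. This is a rough sketch of the idea, not the authors' code; the clusters and coverage level are simulated.

```python
# Bootstrap sketch for a cluster survey of vaccination coverage (invented data).
import random

random.seed(0)
# 10 clusters of 15 children each; 1 = vaccinated, drawn at ~85% coverage.
clusters = [[1 if random.random() < 0.85 else 0 for _ in range(15)]
            for _ in range(10)]

def bootstrap_se(clusters, per_cluster, reps=500):
    """Standard error of the VC estimate for a 10 x per_cluster design."""
    estimates = []
    for _ in range(reps):
        resampled = [random.choice(clusters) for _ in clusters]    # resample clusters
        kids = [random.sample(c, per_cluster) for c in resampled]  # subsample children
        flat = [x for c in kids for x in c]
        estimates.append(sum(flat) / len(flat))                    # VC estimate
    mean = sum(estimates) / len(estimates)
    return (sum((e - mean) ** 2 for e in estimates) / (reps - 1)) ** 0.5

for m in (15, 7, 3):  # mirrors designs from 10 x 15 down to 10 x 3
    print(m, round(bootstrap_se(clusters, m), 3))
```

    The widening standard error at small per-cluster sizes is the instability the authors report when moving from the 10 × 15 to the 10 × 3 design.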

  5. Maximizing Data Quality using Mode Switching in Mixed-Device Survey Design: Nonresponse Bias and Models of Demographic Behavior

    PubMed Central

    Axinn, William G.; Gatny, Heather H.; Wagner, James

    2016-01-01

    Conducting survey interviews on the internet has become an attractive method for lowering data collection costs and increasing the frequency of interviewing, especially in longitudinal studies. However, the advantages of the web mode for studies with frequent re-interviews can be offset by the serious disadvantage of low response rates and the potential for nonresponse bias to mislead investigators. Important life events, such as changes in employment status, relationship changes, or moving can cause attrition from longitudinal studies, producing the possibility of attrition bias. The potential extent of such bias in longitudinal web surveys is not well understood. We use data from the Relationship Dynamics and Social Life (RDSL) study to examine the potential for a mixed-device approach with active mode switching to reduce attrition bias. The RDSL design allows panel members to switch modes by integrating telephone interviewing into a longitudinal web survey with the objective of collecting weekly reports. We found that in this design allowing panel members to switch modes kept more participants in the study compared to a web-only approach. The characteristics of persons who ever switched modes are different from those of persons who did not, including not only demographic characteristics, but also baseline characteristics related to pregnancy and time-varying characteristics that were collected after the baseline interview. This was true in multivariate models that control for several of these dimensions simultaneously. We conclude that mode options and mode switching are important for the success of longitudinal web surveys to maximize participation and minimize attrition. PMID:26865882

  6. GRAND DESIGN AND FLOCCULENT SPIRALS IN THE SPITZER SURVEY OF STELLAR STRUCTURE IN GALAXIES (S⁴G)

    SciTech Connect

    Elmegreen, Debra Meloy; Yau, Andrew; Elmegreen, Bruce G.; Athanassoula, E.; Bosma, Albert; Helou, George; Sheth, Kartik; Ho, Luis C.; Madore, Barry F.; Menendez-Delmestre, KarIn; Gadotti, Dimitri A.; Knapen, Johan H.; Laurikainen, Eija; Salo, Heikki; Meidt, Sharon E.; Regan, Michael W.; Zaritsky, Dennis; Aravena, Manuel

    2011-08-10

    Spiral arm properties of 46 galaxies in the Spitzer Survey of Stellar Structure in Galaxies (S⁴G) were measured at 3.6 μm, where extinction is small and the old stars dominate. The sample includes flocculent, multiple arm, and grand design types with a wide range of Hubble and bar types. We find that most optically flocculent galaxies are also flocculent in the mid-IR because of star formation uncorrelated with stellar density waves, whereas multiple arm and grand design galaxies have underlying stellar waves. Arm-interarm contrasts increase from flocculent to multiple arm to grand design galaxies and with later Hubble types. Structure can be traced further out in the disk than in previous surveys. Some spirals peak at mid-radius while others continuously rise or fall, depending on Hubble and bar type. We find evidence for regular and symmetric modulations of the arm strength in NGC 4321. Bars tend to be long, high amplitude, and flat-profiled in early-type spirals, with arm contrasts that decrease with radius beyond the end of the bar, and they tend to be short, low amplitude, and exponential-profiled in late Hubble types, with arm contrasts that are constant or increase with radius. Longer bars tend to have larger amplitudes and stronger arms.

  7. Some New Bases and Needs for Interior Design from Environmental Research. A Preliminary Survey.

    ERIC Educational Resources Information Center

    Kleeman, Walter, Jr.

    Research which can form new bases for interior design is being greatly accelerated. Investigations in psychology, anthropology, psychiatry, and biology, as well as interdisciplinary projects, turn up literally hundreds of studies, the results of which will vitally affect interior design. This body of research falls into two parts--(1) human

  8. NATIONAL RESEARCH PROGRAM ON DESIGN-BASED/MODEL-ASSISTED SURVEY METHODOLOGY FOR AQUATIC RESOURCES

    EPA Science Inventory

    We expect to accomplish five major goals with the Program. The first is to extend design-based statistical methodology to cover the unique circumstances encountered in EMAP. The second is to make both existing and newly-developed model-assisted design-based statistical tools m...

  9. An integrated device for magnetically-driven drug release and in situ quantitative measurements: Design, fabrication and testing

    NASA Astrophysics Data System (ADS)

    Bruvera, I. J.; Hernández, R.; Mijangos, C.; Goya, G. F.

    2015-03-01

    We have developed a device capable of remote triggering and in situ quantification of therapeutic drugs, based on magnetically-responsive hydrogels of poly(N-isopropylacrylamide) (PNiPAAm) and alginate. The heating efficiency of these hydrogels, measured by their specific power absorption (SPA), showed that values between 100 and 300 W/g of the material were high enough to reach the lower critical solution temperature (LCST) of the polymeric matrix within a few minutes. The drug release through application of AC magnetic fields could be controlled by time-modulated field pulses in order to deliver the desired amount of drug. Using vitamin B12 as a concept drug, the device was calibrated to measure amounts of drug released as small as 25(2)×10⁻⁹ g, demonstrating the potential of this device for very precise quantitative control of drug release.

  10. The FMOS-COSMOS Survey of Star-forming Galaxies at z~1.6. III. Survey Design, Performance, and Sample Characteristics

    NASA Astrophysics Data System (ADS)

    Silverman, J. D.; Kashino, D.; Sanders, D.; Kartaltepe, J. S.; Arimoto, N.; Renzini, A.; Rodighiero, G.; Daddi, E.; Zahid, J.; Nagao, T.; Kewley, L. J.; Lilly, S. J.; Sugiyama, N.; Baronchelli, I.; Capak, P.; Carollo, C. M.; Chu, J.; Hasinger, G.; Ilbert, O.; Juneau, S.; Kajisawa, M.; Koekemoer, A. M.; Kovac, K.; Le Fèvre, O.; Masters, D.; McCracken, H. J.; Onodera, M.; Schulze, A.; Scoville, N.; Strazzullo, V.; Taniguchi, Y.

    2015-09-01

    We present a spectroscopic survey of galaxies in the COSMOS field using the Fiber Multi-object Spectrograph (FMOS), a near-infrared instrument on the Subaru Telescope. Our survey is specifically designed to detect the Hα emission line that falls within the H-band (1.6-1.8 μm) spectroscopic window from star-forming galaxies with 1.4 < z < 1.7 and M_stellar ≳ 10¹⁰ M⊙. With the high multiplex capability of FMOS, it is now feasible to construct samples of over 1000 galaxies having spectroscopic redshifts at epochs that were previously challenging. The high-resolution mode (R ~ 2600) effectively separates Hα and [N ii]λ6585, thus enabling studies of the gas-phase metallicity and photoionization state of the interstellar medium. The primary aim of our program is to establish how star formation depends on stellar mass and environment, both recognized as drivers of galaxy evolution at lower redshifts. In addition to the main galaxy sample, our target selection places priority on those detected in the far-infrared by Herschel/PACS to assess the level of obscured star formation and investigate, in detail, outliers from the star formation rate (SFR)-stellar mass relation. Galaxies with Hα detections are followed up with FMOS observations at shorter wavelengths using the J-long (1.11-1.35 μm) grating to detect Hβ and [O iii]λ5008, which provides an assessment of the extinction required to measure SFRs not hampered by dust, and an indication of embedded active galactic nuclei. With 460 redshifts measured from 1153 spectra, we assess the performance of the instrument with respect to achieving our goals, discuss inherent biases in the sample, and detail the emission-line properties. Our higher-level data products, including catalogs and spectra, are available to the community.

  11. Aerodynamic aircraft design methods and their notable applications: Survey of the activity in Japan

    NASA Technical Reports Server (NTRS)

    Fujii, Kozo; Takanashi, Susumu

    1991-01-01

    An overview of aerodynamic aircraft design methods and their recent applications in Japan is presented. A design code developed at the National Aerospace Laboratory (NAL), and still in use, is discussed; hence, most of the examples are the result of collaborative work between heavy industry and NAL. A wide variety of applications in the transonic to supersonic flow regimes are presented. Although the design of aircraft elements for external flows is the main focus, some internal flow applications are also presented. Recent applications of the design code, using the Navier-Stokes and Euler equations in the analysis mode, include the design of HOPE (a space vehicle) and Upper Surface Blowing (USB) aircraft configurations.

  12. Using SEM to Analyze Complex Survey Data: A Comparison between Design-Based Single-Level and Model-Based Multilevel Approaches

    ERIC Educational Resources Information Center

    Wu, Jiun-Yu; Kwok, Oi-man

    2012-01-01

    Both ad-hoc robust sandwich standard error estimators (design-based approach) and multilevel analysis (model-based approach) are commonly used for analyzing complex survey data with nonindependent observations. Although these 2 approaches perform equally well on analyzing complex survey data with equal between- and within-level model structures…

  13. A Survey of Applications and Research in Integrated Design Systems Technology

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The initial part of the study began with a combination of literature searches, World Wide Web searches, and contacts with individuals and companies who were known to members of our team to have an interest in topics that seemed to be related to our study. There is a long list of such topics, such as concurrent engineering, design for manufacture, life-cycle engineering, systems engineering, systems integration, systems design, design systems, integrated product and process approaches, enterprise integration, integrated product realization, and similar terms. These all capture, at least in part, the flavor of what we describe here as integrated design systems. An inhibiting factor in this inquiry was the absence of agreed terminology for the study of integrated design systems. It is common for the term to be applied to what are essentially augmented Computer-Aided Design (CAD) systems, which are integrated only to the extent that agreements have been reached to attach proprietary extensions to proprietary CAD programs. It is also common for some to use the term integrated design systems to mean a system that applies only, or mainly, to the design phase of a product life cycle. It is likewise common for many of the terms listed earlier to be used as synonyms for integrated design systems. We tried to avoid this ambiguity by adopting the definition of integrated design systems that is implied in the introductory notes that we provided to our contacts, cited earlier. We thus arrived at this definition: Integrated Design Systems refers to the integration of the different tools and processes that comprise the engineering of complex systems. It takes a broad view of the engineering of systems, to include consideration of the entire product realization process and the product life cycle.
An important aspect of integrated design systems is the extent to which they integrate existing "islands of automation" into a comprehensive design and product realization environment. As the study progressed, we relied increasingly upon a networking approach to lead us to new information. The departure point for such searches often was a government-sponsored project or a company initiative. The advantage of this approach was that short conversations with knowledgeable persons would usually cut through confusion over differences of terminology, thereby somewhat reducing the search space of the study. Even so, it was not until late in our eight-month inquiry that we began to see signs of convergence of the search, in the sense that a number of the latest inquiries began to turn up references to earlier contacts. As suggested above, this convergence often occurred with respect to particular government or company projects.

  14. Longitudinal emittance: An introduction to the concept and survey of measurement techniques including design of a wall current monitor

    NASA Astrophysics Data System (ADS)

    Webber, Robert C.

    1990-10-01

    The properties of charged particle beams associated with the distribution of the particles in energy and in time can be grouped together under the category of longitudinal emittance. This article is intended to provide an intuitive introduction to the concepts of longitudinal emittance; to provide an incomplete survey of methods used to measure this emittance and related properties of bunch length and momentum spread; and to describe the detailed design of a 6 GHz bandwidth resistive wall current monitor useful for measuring bunch shapes of moderate to high intensity beams. Overall, the article is intended to be broad in scope, in most cases deferring details to cited original papers.

  15. Longitudinal emittance: An introduction to the concept and survey of measurement techniques including design of a wall current monitor

    SciTech Connect

    Webber, R.C.

    1990-03-01

    The properties of charged particle beams associated with the distribution of the particles in energy and in time can be grouped together under the category of longitudinal emittance. This article is intended to provide an intuitive introduction to the concepts of longitudinal emittance; to provide an incomplete survey of methods used to measure this emittance and the related properties of bunch length and momentum spread; and to describe the detailed design of a 6 GHz bandwidth resistive wall current monitor useful for measuring bunch shapes of moderate to high intensity beams. Overall, the article is intended to be broad in scope, in most cases deferring details to cited original papers. 37 refs., 21 figs.

  16. Design of the Nationwide Nursery School Survey on Child Health Throughout the Great East Japan Earthquake

    PubMed Central

    Matsubara, Hiroko; Ishikuro, Mami; Kikuya, Masahiro; Chida, Shoichi; Hosoya, Mitsuaki; Ono, Atsushi; Kato, Noriko; Yokoya, Susumu; Tanaka, Toshiaki; Isojima, Tsuyoshi; Yamagata, Zentaro; Tanaka, Soichiro; Kuriyama, Shinichi; Kure, Shigeo

    2016-01-01

    Background The Great East Japan Earthquake inflicted severe damage on the Pacific coastal areas of northeast Japan. Although possible health impacts on aged or handicapped populations have been highlighted, little is known about how the serious disaster affected preschool children’s health. We conducted a nationwide nursery school survey to investigate preschool children’s physical development and health status throughout the disaster. Methods The survey was conducted from September to December 2012. We mailed three kinds of questionnaires to nursery schools in all 47 prefectures in Japan. Questionnaire “A” addressed nursery school information, and questionnaires “B1” and “B2” addressed individuals’ data. Our targets were children who were born from April 2, 2004, to April 1, 2005 (those who did not experience the disaster during their preschool days) and children who were born from April 2, 2006, to April 1, 2007 (those who experienced the disaster during their preschool days). The questionnaire inquired about disaster experiences, anthropometric measurements, and presence of diseases. Results In total, 3624 nursery schools from all 47 prefectures participated in the survey. We established two nationwide retrospective cohorts of preschool children; 53 747 children who were born from April 2, 2004, to April 1, 2005, and 69 004 children who were born from April 2, 2006, to April 1, 2007. Among the latter cohort, 1003 were reported to have specific personal experiences with the disaster. Conclusions With the large dataset, we expect to yield comprehensive study results about preschool children’s physical development and health status throughout the disaster. PMID:26460382

  17. Design and sample characteristics of the 2005-2008 Nutrition and Health Survey in Taiwan.

    PubMed

    Tu, Su-Hao; Chen, Cheng; Hsieh, Yao-Te; Chang, Hsing-Yi; Yeh, Chih-Jung; Lin, Yi-Chin; Pan, Wen-Harn

    2011-01-01

    The Nutrition and Health Survey in Taiwan (NAHSIT) 2005-2008 was funded by the Department of Health to provide continued assessment of health and nutrition of the people in Taiwan. This household survey collected data from children aged less than 6 years and adults aged 19 years and above, and adopted a three-stage stratified, clustered sampling scheme similar to that used in the NAHSIT 1993-1996. Four samples were produced. One sample with five geographical strata was selected for inference to the whole of Taiwan, while the other three samples, including Hakka, Penghu and mountainous areas were produced for inference to each cultural stratum. A total of 6,189 household interviews and 3,670 health examinations were completed. Interview data included household information, socio-demographics, 24-hour dietary recall, food frequency and habits, dietary and nutritional knowledge, attitudes and behaviors, physical activity, medical history and bone health. Health exam data included anthropometry, blood pressure, physical fitness, bone density, as well as blood and urine collection. Response rate for the household interview was 65%. Of these household interviews, 59% participated in the health exam. Only in a few age subgroups were there significant differences in sex, age, education, or ethnicity distribution between respondents and non-respondents. For the health exam, certain significant differences between participants and non-participants were mostly observed in those aged 19-64 years. The results of this survey will be of benefit to researchers, policy makers and the public to understand and improve the nutrition and health status of pre-school children and adults in Taiwan. PMID:21669592

  18. Survey of Aerothermodynamics Facilities Useful for the Design of Hypersonic Vehicles Using Air-Breathing Propulsion

    NASA Technical Reports Server (NTRS)

    Arnold, James O.; Deiwert, George S.

    1997-01-01

    This paper surveys the use of aerothermodynamic facilities which have been useful in the study of external flows and propulsion aspects of hypersonic, air-breathing vehicles. While the paper is not a survey of all facilities, it covers the utility of shock tunnels and conventional hypersonic blow-down facilities which have been used for hypersonic air-breather studies. The problems confronting researchers in the field of aerothermodynamics are outlined. Results from the T5 GALCIT tunnel for the shock-on-lip problem are outlined. Experiments on combustors and short expansion nozzles using the semi-free jet method have been conducted in large shock tunnels. An example which employed the NASA Ames 16-Inch shock tunnel is outlined, and the philosophy of the test technique is described. Conventional blow-down hypersonic wind tunnels are quite useful in hypersonic air-breathing studies. Results from an expansion ramp experiment, simulating the nozzle on a hypersonic air-breather, from the NASA Ames 3.5 Foot Hypersonic wind tunnel are summarized. Similar work on expansion nozzles conducted in the NASA Langley hypersonic wind tunnel complex is cited. Free-jet air-frame propulsion integration and configuration stability experiments conducted at Langley in the hypersonic wind tunnel complex on a small generic model are also summarized.

  19. Survey and analysis of research on supersonic drag-due-to-lift minimization with recommendations for wing design

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Mann, Michael J.

    1992-01-01

    A survey of research on drag-due-to-lift minimization at supersonic speeds, including a study of the effectiveness of current design and analysis methods was conducted. The results show that a linearized theory analysis with estimated attainable thrust and vortex force effects can predict with reasonable accuracy the lifting efficiency of flat wings. Significantly better wing performance can be achieved through the use of twist and camber. Although linearized theory methods tend to overestimate the amount of twist and camber required for a given application and provide an overly optimistic performance prediction, these deficiencies can be overcome by implementation of recently developed empirical corrections. Numerous examples of the correlation of experiment and theory are presented to demonstrate the applicability and limitations of linearized theory methods with and without empirical corrections. The use of an Euler code for the estimation of aerodynamic characteristics of a twisted and cambered wing and its application to design by iteration are discussed.

  20. Ultradeep IRAC Imaging Over the HUDF and GOODS-South: Survey Design and Imaging Data Release

    NASA Astrophysics Data System (ADS)

    Labbé, I.; Oesch, P. A.; Illingworth, G. D.; van Dokkum, P. G.; Bouwens, R. J.; Franx, M.; Carollo, C. M.; Trenti, M.; Holden, B.; Smit, R.; González, V.; Magee, D.; Stiavelli, M.; Stefanon, M.

    2015-12-01

    The IRAC ultradeep field and IRAC Legacy over GOODS programs are two ultradeep imaging surveys at 3.6 and 4.5 μm with the Spitzer Infrared Array Camera (IRAC). The primary aim is to directly detect the infrared light of reionization epoch galaxies at z > 7 and to constrain their stellar populations. The observations cover the Hubble Ultra Deep Field (HUDF), including the two HUDF parallel fields, and the CANDELS/GOODS-South, and are combined with archival data from all previous deep programs into one ultradeep data set. The resulting imaging reaches unprecedented coverage in IRAC 3.6 and 4.5 μm ranging from >50 hr over 150 arcmin², >100 hr over 60 arcmin², to ~200 hr over 5-10 arcmin². This paper presents the survey description, data reduction, and public release of reduced mosaics on the same astrometric system as the CANDELS/GOODS-South Wide Field Camera 3 (WFC3) data. To facilitate prior-based WFC3+IRAC photometry, we introduce a new method to create high signal-to-noise PSFs from the IRAC data and reconstruct the complex spatial variation due to survey geometry. The PSF maps are included in the release, as are registered maps of subsets of the data to enable reliability and variability studies. Simulations show that the noise in the ultradeep IRAC images decreases approximately as the square root of integration time over the range 20-200 hr, well below the classical confusion limit, reaching 1σ point-source sensitivities as faint as 15 nJy (28.5 AB) at 3.6 μm and 18 nJy (28.3 AB) at 4.5 μm. The value of such ultradeep IRAC data is illustrated by direct detections of z = 7-8 galaxies as faint as H_AB = 28. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained from the data archive at the Space Telescope Science Institute. STScI is operated by the Association of Universities for Research in Astronomy, Inc. under NASA contract NAS 5-26555.
Based on observations made with the Spitzer Space Telescope, which is operated by the Jet Propulsion Laboratory, California Institute of Technology under a contract with NASA. Support for this work was provided by NASA through an award issued by JPL/Caltech.

  1. Fitting statistical distributions to sea duck count data: implications for survey design and abundance estimation

    USGS Publications Warehouse

    Zipkin, Elise F.; Leirness, Jeffery B.; Kinlan, Brian P.; O'Connell, Allan F.; Silverman, Emily D.

    2014-01-01

    Determining appropriate statistical distributions for modeling animal count data is important for accurate estimation of abundance, distribution, and trends. In the case of sea ducks along the U.S. Atlantic coast, managers want to estimate local and regional abundance to detect and track population declines, to define areas of high and low use, and to predict the impact of future habitat change on populations. In this paper, we used a modified marked point process to model survey data that recorded flock sizes of Common eiders, Long-tailed ducks, and Black, Surf, and White-winged scoters. The data come from an experimental aerial survey, conducted by the United States Fish & Wildlife Service (USFWS) Division of Migratory Bird Management, during which east-west transects were flown along the Atlantic Coast from Maine to Florida during the winters of 2009–2011. To model the number of flocks per transect (the points), we compared the fit of four statistical distributions (zero-inflated Poisson, zero-inflated geometric, zero-inflated negative binomial and negative binomial) to data on the number of species-specific sea duck flocks that were recorded for each transect flown. To model the flock sizes (the marks), we compared the fit of flock size data for each species to seven statistical distributions: positive Poisson, positive negative binomial, positive geometric, logarithmic, discretized lognormal, zeta and Yule–Simon. Akaike’s Information Criterion and Vuong’s closeness tests indicated that the negative binomial and discretized lognormal were the best distributions for all species for the points and marks, respectively. 
These findings have important implications for estimating sea duck abundances as the discretized lognormal is a more skewed distribution than the Poisson and negative binomial, which are frequently used to model avian counts; the lognormal is also less heavy-tailed than the power law distributions (e.g., zeta and Yule–Simon), which are becoming increasingly popular for group size modeling. Choosing appropriate statistical distributions for modeling flock size data is fundamental to accurately estimating population summaries, determining required survey effort, and assessing and propagating uncertainty through decision-making processes.
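    The model-selection step described above can be sketched with a toy AIC comparison between a Poisson and a negative-binomial fit to overdispersed count data. This is illustrative only, not the paper's code: the counts are invented, and the negative binomial uses simple method-of-moments parameters (valid when the sample variance exceeds the mean) rather than full maximum likelihood.

```python
# Toy AIC comparison of Poisson vs. negative binomial on overdispersed counts.
import math

counts = [0, 0, 1, 2, 2, 3, 5, 8, 15, 40]  # invented, heavy-tailed flock counts

def poisson_aic(xs):
    lam = sum(xs) / len(xs)  # Poisson MLE is the sample mean
    ll = sum(x * math.log(lam) - lam - math.lgamma(x + 1) for x in xs)
    return 2 * 1 - 2 * ll    # one free parameter

def nbinom_aic(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    p = m / v                # method-of-moments; requires v > m (overdispersion)
    r = m * p / (1 - p)
    ll = sum(math.lgamma(x + r) - math.lgamma(r) - math.lgamma(x + 1)
             + r * math.log(p) + x * math.log(1 - p) for x in xs)
    return 2 * 2 - 2 * ll    # two free parameters

print(poisson_aic(counts), nbinom_aic(counts))
```

    For data with variance well above the mean, the negative binomial yields the lower AIC, mirroring the paper's finding that it outperforms the Poisson for flock counts.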

  2. Spectacle Design Preferences among Chinese Primary and Secondary Students and Their Parents: A Qualitative and Quantitative Study

    PubMed Central

    Zhou, Zhongqiang; Kecman, Maja; Chen, Tingting; Liu, Tianyu; Jin, Ling; Chen, Shangji; Chen, Qianyun; He, Mingguang; Silver, Josh; Moore, Bruce; Congdon, Nathan

    2014-01-01

    Purpose To identify the specific characteristics making glasses designs, particularly those compatible with adjustable glasses, more or less appealing to Chinese children and their parents. Patients and Methods Primary and secondary school children from urban and rural China with ≤ −1.00 diopters of bilateral myopia and their parents ranked four conventional-style frames identified by local optical shops as popular versus four child-specific frames compatible with adjustable spectacles. Scores based on the proportion of maximum possible ranking were computed for each style. Selected children and their parents also participated in Focus Groups (FGs) discussing spectacle design preference. Recordings were transcribed and coded by two independent reviewers using NVivo software. Results Among 136 urban primary school children (age range 9-11 years), 290 rural secondary school children (11-17 years) and 16 parents, all adjustable-style frames (scores on 0-100 scale 25.7-62.4) were ranked behind all conventional frames (63.0-87.5). For eight FGs including 12 primary children, 26 secondary children and 16 parents, average kappa values for NVivo coding were 0.81 (students) and 0.70 (parents). All groups agreed that the key changes to make adjustable designs more attractive were altering the round lenses to rectangular or oval shapes and adding curved earpieces for more stable wear. The thick frames of the adjustable designs were considered stylish, and children indicated they would wear them if the lens shape were modified. Conclusions Current adjustable lens designs are unattractive to Chinese children and their parents, though this study identified specific modifications which would make them more appealing. PMID:24594799

  3. Design and construction of instrument rotator for the Sloan Digital Sky Survey (SDSS) telescope

    NASA Astrophysics Data System (ADS)

    Leger, R. French; Long, Dan; Carey, Larry N.; Owen, Russell E.; Sigmund, Walter

    2003-02-01

    This paper describes the concerns, parameters and restrictions in the design and construction of the instrument rotator used on the SDSS telescope. The rotator supports two 600-lb spectrographs through motion in all axes without translating harmful radial moments to its inner ring, which supports the mosaic imaging camera. This is accomplished using an outer-inner ring design. The outer ring is a thin-walled box structure incorporating the drive surface and is attached to the inner ring through a steel membrane. This rotator design requires the telescope's primary support structure to provide final structural integrity. Because of this feature, a special fixture was needed to transport the rotator from the vendor and to install it onto the telescope. Positional accuracy and feedback are provided by an optical tape and read-head system manufactured by Heidenhain and attached to the inner ring. The drive motor was designed to use the same motor as those employed for the other two telescope axes, thus minimizing the spare-parts inventory and maintenance. Its drive pinion is of a pinch design, with the pinion axis parallel to the rotator radius. A great deal of attention and planning was required in the construction of the box-frame outer ring and the induction heat-treating of the drive surface. Drive-surface tolerances were maintained within +/-0.001 inches, and internal stress cracks from heat-treating were minimal.

  4. Advanced power generation systems for the 21st Century: Market survey and recommendations for a design philosophy

    SciTech Connect

    Andriulli, J.B.; Gates, A.E.; Haynes, H.D.; Klett, L.B.; Matthews, S.N.; Nawrocki, E.A.; Otaduy, P.J.; Scudiere, M.B.; Theiss, T.J.; Thomas, J.F.; Tolbert, L.M.; Yauss, M.L.; Voltz, C.A.

    1999-11-01

    The purpose of this report is to document the results of a study designed to enhance the performance of future military generator sets (gen-sets) in the medium power range. The study includes a market survey of the state of the art in several key component areas and recommendations comprising a design philosophy for future military gen-sets. The market survey revealed that the commercial market is in a state of flux, but it is currently or will soon be capable of providing the technologies recommended here in a cost-effective manner. The recommendations, if implemented, should result in future power generation systems that are much more functional than today's gen-sets. The number of differing units necessary (both family sizes and frequency modes) to cover the medium power range would be decreased significantly, while the weight and volume of each unit would decrease, improving the transportability of the power source. Improved fuel economy and overall performance would result from more effective utilization of the prime mover in the generator. The units would allow for more flexibility and control, improved reliability, and more effective power management in the field.

  5. Design and status of the optical corrector for the DES survey instrument

    NASA Astrophysics Data System (ADS)

    Doel, P.; Abbott, T.; Antonik, M.; Bernstein, R.; Bigelow, B.; Brooks, D.; Cease, H.; DePoy, D. L.; Flaugher, B.; Gladders, M.; Gutierrez, G.; Kent, S.; Stefanik, A.; Walker, A.; Worswick, S.

    2008-07-01

    The DECam instrument, for the 4-m Blanco telescope at CTIO, is a 5-lens-element wide field camera giving a 2.2-degree-diameter field of view. The lenses are large, with the biggest being 980 mm in diameter, and this poses challenges in mounting and alignment. This paper reports the status of the production of the optics for the DECam wide field imager. Also presented are the design and finite element modelling of the cell design for the 5 lenses of the imager, along with the proposed alignment process.

  6. Disposable surface plasmon resonance aptasensor with membrane-based sample handling design for quantitative interferon-gamma detection.

    PubMed

    Chuang, Tsung-Liang; Chang, Chia-Chen; Chu-Su, Yu; Wei, Shih-Chung; Zhao, Xi-hong; Hsueh, Po-Ren; Lin, Chii-Wann

    2014-08-21

    ELISA and ELISPOT methods are utilized for interferon-gamma (IFN-γ) release assays (IGRAs) to detect the IFN-γ secreted by T lymphocytes. However, the multi-step protocols of the assays are still performed with laboratory instruments and operated by well-trained people. Here, we report a membrane-based microfluidic device integrated with a surface plasmon resonance (SPR) sensor to realize an easy-to-use and cost effective multi-step quantitative analysis. To conduct the SPR measurements, we utilized a membrane-based SPR sensing device in which a rayon membrane was located 300 μm under the absorbent pad. The basic equation covering this type of transport is based on Darcy's law. Furthermore, the concentration of streptavidin delivered from a sucrose-treated glass pad placed alongside the rayon membrane was controlled in a narrow range (0.81 μM ± 6%). Finally, the unbound molecules were removed by a washing buffer that was pre-packed in the reservoir of the chip. Using a bi-functional, hairpin-shaped aptamer as the sensing probe, we specifically detected the IFN-γ and amplified the signal by binding the streptavidin. A high correlation coefficient (R(2) = 0.995) was obtained, in the range from 0.01 to 100 nM. A detection limit of 10 pM was achieved within 30 min. Thus, the SPR assay protocols for IFN-γ detection could be performed using this simple device without an additional pumping system. PMID:24931052
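
    The membrane transport that the authors describe with Darcy's law relates volumetric flow to the applied pressure drop through the membrane's permeability. A minimal sketch of that relation, with illustrative values only (not the device's actual parameters):

```python
def darcy_flow_rate(k, area, delta_p, mu, length):
    """Darcy's law: Q = k * A * delta_p / (mu * L).

    k: permeability (m^2), area: cross-section (m^2),
    delta_p: pressure drop (Pa), mu: viscosity (Pa*s), length: flow path (m).
    """
    return k * area * delta_p / (mu * length)

# Hypothetical membrane values: flow driven by capillary suction of the pad
q = darcy_flow_rate(k=1e-12, area=1e-6, delta_p=1e3, mu=1e-3, length=1e-2)
```

    For fixed geometry and fluid, the flow rate scales linearly with the pressure drop, which is why a passive absorbent pad can substitute for an external pump in such a device.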

  7. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    ERIC Educational Resources Information Center

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  8. Siphon penstock installations at hydroelectric projects: A survey of design, construction and operating experience

    SciTech Connect

    Burgoine, D.; Rodrigue, P.; Tarbell, J.C. (Acres International Corp., Amherst, NY)

    1989-01-01

    There can be advantages to using siphon penstocks at small hydro projects, particularly those constructed at existing dams. One problem, however, is a lack of documentation of siphon penstock installations. The design considerations, construction and operating aspects of siphon penstock installations are described here. 4 figs., 1 tab.

  9. Survey of piloting factors in V/STOL aircraft with implications for flight control system design

    NASA Technical Reports Server (NTRS)

    Ringland, R. F.; Craig, S. J.

    1977-01-01

    Flight control system design factors involved for pilot workload relief are identified. Major contributors to pilot workload include configuration management and control and aircraft stability and response qualities. A digital fly by wire stability augmentation, configuration management, and configuration control system is suggested for reduction of pilot workload during takeoff, hovering, and approach.

  10. A Survey of Career Guidance Needs of Industrial Design Students in Taiwanese Universities

    ERIC Educational Resources Information Center

    Yang, Ming-Ying; You, Manlai

    2010-01-01

    School pupils in Taiwan spend most of their time in studying and having examinations, and consequently many of them decide what major to study in universities rather hastily. Industrial design (ID) programs in universities nowadays recruit students from general and vocational senior high schools through a variety of channels. As a consequence, ID…

  11. Research design and statistical methods in Indian medical journals: a retrospective survey.

    PubMed

    Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S; Mayya, Shreemathi S

    2015-01-01

    Good quality medical research generally requires not only expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten leading Indian medical journals were selected based on impact factors, and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. Main outcomes considered in the present study were study design types and their frequencies, the proportion of errors/defects in study design, statistical analyses, and implementation of the CONSORT checklist in RCTs (randomized clinical trials). From 2003 to 2013: the proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418), 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013; the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.0001) from 42.5% (250/588) to 56.7% (439/774). The overall proportion of errors in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.0001), 41.3% (243/588) compared to 30.6% (237/774). In 2013, randomized clinical trial designs remained very rare (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)]. Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p<0.0001), 82.2% (263/320) compared to 66.3% (325/490), and interpretation (χ2=25.616, Φ=0.173, p<0.0001), 32.5% (104/320) compared to 17.1% (84/490), though some serious errors were still present. Indian medical research seems to have made no major progress in the use of correct statistical analyses, but errors/defects in study designs have decreased significantly. Randomized clinical trials are published quite rarely and have a high proportion of methodological problems. PMID:25856194
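
    The chi-square comparisons in this abstract can be reproduced directly from the reported counts. A sketch computing Pearson's χ2 and the Φ effect size for the erroneous-analyses comparison (80 of 320 articles in 2003 vs. 111 of 490 in 2013):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square for the 2x2 table [[a, b], [c, d]], no continuity correction."""
    n = a + b + c + d
    rows, cols = (a + b, c + d), (a + c, b + d)
    chi2 = 0.0
    for obs, r, col in ((a, rows[0], cols[0]), (b, rows[0], cols[1]),
                        (c, rows[1], cols[0]), (d, rows[1], cols[1])):
        expected = r * col / n
        chi2 += (obs - expected) ** 2 / expected
    return chi2

# Erroneous statistical analyses: 80/320 (2003) vs. 111/490 (2013)
chi2 = chi_square_2x2(80, 240, 111, 379)
phi = (chi2 / 810) ** 0.5   # phi effect size: sqrt(chi2 / N)
```

    Rounded to three decimals this gives χ2 = 0.592 and Φ = 0.027, matching the values reported in the abstract.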

  13. Predictors of intentions to quit smoking in Aboriginal tobacco smokers of reproductive age in regional New South Wales (NSW), Australia: quantitative and qualitative findings of a cross-sectional survey

    PubMed Central

    Gould, Gillian Sandra; Watt, Kerrianne; McEwen, Andy; Cadet-James, Yvonne; Clough, Alan R

    2015-01-01

    Objectives To assess the predictors of intentions to quit smoking in a community sample of Aboriginal smokers of reproductive age, in whom smoking prevalence is slow to decline. Design, setting and participants A cross-sectional survey involved 121 Aboriginal smokers, aged 18–45 years from January to May 2014, interviewed at community events on the Mid-North Coast NSW. Qualitative and quantitative data were collected on smoking and quitting attitudes, behaviours and home smoking rules. Perceived efficacy for quitting, and perceived threat from smoking, were uniquely assessed with a validated Risk Behaviour Diagnosis (RBD) Scale. Main outcome measures Logistic regression explored the impact of perceived efficacy, perceived threat and consulting previously with a doctor or health professional (HP) on self-reported intentions to quit smoking, controlling for potential confounders, that is, protection responses and fear control responses, home smoking rules, gender and age. Participants’ comments regarding smoking and quitting were investigated via inductive analysis, with the assistance of Aboriginal researchers. Results Two-thirds of smokers intended to quit within 3 months. Perceived efficacy (OR=4.8; 95% CI 1.78 to 12.93) and consulting previously with a doctor/HP about quitting (OR=3.82; 95% CI 1.43 to 10.2) were significant predictors of intentions to quit. ‘Smoking is not doing harm right now’ was inversely associated with quit intentions (OR=0.25; 95% CI 0.08 to 0.8). Among those who reported making a quit attempt, after consulting with a doctor/HP, 40% (22/60) rated the professional support received as low (0–2/10). Qualitative themes were: the negatives of smoking (ie, disgust, regret, dependence and stigma), health effects and awareness, quitting, denial, ‘smoking helps me cope’ and social aspects of smoking. 
Conclusions Perceived efficacy and consulting with a doctor/HP about quitting may be important predictors of intentions to quit smoking in Aboriginal smokers of reproductive age. Professional support was generally perceived to be low; thus, it could be improved for these Aboriginal smokers. Aboriginal participants expressed strong sentiments about smoking and quitting. PMID:25770232
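
    The reported odds ratios can be checked for internal consistency: a Wald confidence interval is β ± z·SE on the log-odds scale, so the logistic coefficient and its standard error are recoverable from a published OR and CI, and a consistent CI is geometrically symmetric about the point estimate. A sketch using the perceived-efficacy result (OR=4.8; 95% CI 1.78 to 12.93):

```python
import math

def logit_params_from_or(or_point, ci_low, ci_high, z=1.96):
    """Recover the logistic coefficient and its SE from an OR and its Wald CI."""
    beta = math.log(or_point)
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * z)
    return beta, se

beta, se = logit_params_from_or(4.8, 1.78, 12.93)
# Consistency check: the CI's geometric midpoint should equal the point estimate
midpoint = math.sqrt(1.78 * 12.93)
```

    Here the geometric midpoint of the interval is approximately 4.80, so the reported OR and CI are mutually consistent.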

  14. A survey on the design of multiprocessing systems for artificial intelligence applications

    NASA Technical Reports Server (NTRS)

    Wah, Benjamin W.; Li, Guo Jie

    1989-01-01

    Some issues in designing computers for artificial intelligence (AI) processing are discussed. These issues are divided into three levels: the representation level, the control level, and the processor level. The representation level deals with the knowledge and methods used to solve the problem and the means to represent it. The control level is concerned with the detection of dependencies and parallelism in the algorithmic and program representations of the problem, and with the synchronization and scheduling of concurrent tasks. The processor level addresses the hardware and architectural components needed to evaluate the algorithmic and program representations. Solutions for the problems of each level are illustrated by a number of representative systems. Design decisions in existing projects on AI computers are classified into top-down, bottom-up, and middle-out approaches.

  15. Survey of Aerothermodynamics Facilities Useful for the Design of Hypersonic Vehicles Using Air-Breathing Propulsion

    NASA Technical Reports Server (NTRS)

    Arnold, James O.; Deiwert, G. S.

    1997-01-01

    The dream of producing an air-breathing, hydrogen fueled, hypervelocity aircraft has been before the aerospace community for decades. However, such a craft has not yet been realized, even in an experimental form. Despite the simplicity and beauty of the concept, many formidable problems must be overcome to make this dream a reality. This paper summarizes the aero/aerothermodynamic issues that must be addressed to make the dream a reality and discusses how aerothermodynamics facilities and their modern companion, real-gas computational fluid dynamics (CFD), can help solve the problems blocking the way to realizing the dream. The approach of the paper is first to outline the concept of an air-breathing hypersonic vehicle and then discuss the nose-to-tail aerothermodynamics issues and special aerodynamic problems that arise with such a craft. Then the utility of aerothermodynamic facilities and companion CFD analysis is illustrated by reviewing results from recent United States publications wherein these problems have been addressed. Papers selected for the discussion have been chosen such that the review will serve to survey important U.S. aero/aerothermodynamic real gas and conventional wind tunnel facilities that are useful in the study of hypersonic, hydrogen propelled hypervelocity vehicles.

  16. Design and Calibration of a Flowfield Survey Rake for Inlet Flight Research

    NASA Technical Reports Server (NTRS)

    Flynn, Darin C.; Ratnayake, Nalin A.; Frederick, Michael

    2009-01-01

    A flowfield rake was designed to quantify the flowfield for inlet research underneath NASA DFRC's F-15B airplane. Detailed loads and stress analyses were performed using CFD and empirical methods to assure structural integrity. Calibration data were generated through wind tunnel testing of the rake. A calibration algorithm was developed to determine the local Mach number and flow angularity at each probe. RAGE was flown in November 2008. Data are currently being analyzed.

  17. Mail and Web Surveys: A Comparison of Demographic Characteristics and Response Quality When Respondents Self-Select the Survey Administration Mode

    ERIC Educational Resources Information Center

    Mackety, Dawn M.

    2007-01-01

    The purpose of this study was to use a nonexperimental, quantitative design to compare mail and web surveys with survey mode self-selection at two data collection waves. Research questions examined differences and predictabilities among demographics (gender, ethnicity, age, and professional employment) and response quality (pronoun use, item…

  18. Experimental design approach for the optimisation of a HPLC-fluorimetric method for the quantitation of the angiotensin II receptor antagonist telmisartan in urine.

    PubMed

    Torrealday, N; González, L; Alonso, R M; Jiménez, R M; Ortiz Lastra, E

    2003-08-01

    A high performance liquid chromatographic method with fluorimetric detection has been developed for the quantitation of the angiotensin II receptor antagonist (ARA II) 4-((2-n-propyl-4-methyl-6-(1-methylbenzimidazol-2-yl)-benzimidazol-1-yl)methyl)biphenyl-2-carboxylic acid (telmisartan) in urine, using a Novapak C18 column (3.9 x 150 mm, 4 microm). The mobile phase consisted of a mixture of acetonitrile and phosphate buffer (pH 6.0, 5 mM) (45:55, v/v) pumped at a flow rate of 0.5 ml min(-1). Effluent was monitored at excitation and emission wavelengths of 305 and 365 nm, respectively. Separation was carried out at room temperature. Chromatographic variables were optimised by means of experimental design. A clean-up step was used for urine samples, consisting of a solid-phase extraction procedure with C8 cartridges and methanol as eluent. This method proved to be accurate (RE from -12 to 6%), precise (intra- and inter-day coefficients of variation (CV) were lower than 8%) and sensitive enough (limit of quantitation (LOQ), ca. 1 microg l(-1)) to be applied to the determination of the active drug in urine samples obtained from hypertensive patients. Concentration levels of telmisartan at different time intervals (from 0 up to 36 h after oral intake) were monitored. PMID:12899971
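
    Quantitation against a chromatographic calibration curve of this kind rests on an ordinary least-squares fit of detector response versus concentration, and one common convention estimates the LOQ as 10 times the blank noise divided by the slope. A minimal sketch with made-up calibration data (not the paper's measurements):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical standards: concentration (ug/L) vs. fluorescence response
conc = [0.0, 2.0, 5.0, 10.0, 20.0]
resp = [0.1, 4.2, 10.1, 20.3, 39.9]
slope, intercept = fit_line(conc, resp)

sd_blank = 0.2               # assumed SD of repeated blank responses
loq = 10 * sd_blank / slope  # LOQ by the 10-sigma convention
```

    With these illustrative numbers the LOQ comes out near 1 ug/L, the same order as the value the authors report for telmisartan.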

  19. A Study of Program Management Procedures in the Campus-Based and Basic Grant Programs. Technical Report No. 1: Sample Design, Student Survey Yield and Bias.

    ERIC Educational Resources Information Center

    Puma, Michael J.; Ellis, Richard

    This report, part of a study of program management procedures in the campus-based and Basic Educational Opportunity Grant programs, describes the design of the site visit component of the study and the results of the student survey, both in terms of the yield obtained and the quality of the data. Chapter 2 describes the design of the sampling methodology employed…

  20. Designing for Dissemination Among Public Health Researchers: Findings From a National Survey in the United States

    PubMed Central

    Jacobs, Julie A.; Tabak, Rachel G.; Hoehner, Christine M.; Stamatakis, Katherine A.

    2013-01-01

    Objectives. We have described the practice of designing for dissemination among researchers in the United States with the intent of identifying gaps and areas for improvement. Methods. In 2012, we conducted a cross-sectional study of 266 researchers using a search of the top 12 public health journals in PubMed and lists available from government-sponsored research. The sample involved scientists at universities, the National Institutes of Health, and the Centers for Disease Control and Prevention in the United States. Results. In the pooled sample, 73% of respondents estimated they spent less than 10% of their time on dissemination. About half of respondents (53%) had a person or team in their unit dedicated to dissemination. Seventeen percent of all respondents used a framework or theory to plan their dissemination activities. One third of respondents (34%) always or usually involved stakeholders in the research process. Conclusions. The current data and the existing literature suggest considerable room for improvement in designing for dissemination. PMID:23865659

  1. Methodological survey of designed uneven randomization trials (DU-RANDOM): a protocol

    PubMed Central

    2014-01-01

    Background Although even randomization (that is, approximately 1:1 randomization ratio in study arms) provides the greatest statistical power, designed uneven randomization (DUR), (for example, 1:2 or 1:3) is used to increase participation rates. Until now, no convincing data exists addressing the impact of DUR on participation rates in trials. The objective of this study is to evaluate the epidemiology and to explore factors associated with DUR. Methods We will search for reports of RCTs published within two years in 25 general medical journals with the highest impact factor according to the Journal Citation Report (JCR)-2010. Teams of two reviewers will determine eligibility and extract relevant information from eligible RCTs in duplicate and using standardized forms. We will report the prevalence of DUR trials, the reported reasons for using DUR, and perform a linear regression analysis to estimate the association between the randomization ratio and the associated factors, including participation rate, type of informed consent, clinical area, and so on. Discussion A clearer understanding of RCTs with DUR and its association with factors in trials, for example, participation rate, can optimize trial design and may have important implications for both researchers and users of the medical literature. PMID:24456965
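
    The background claim that 1:1 randomization gives the greatest statistical power can be illustrated directly: with a fixed total sample size, the variance of the difference in group means is proportional to 1/n1 + 1/n2, which is minimized when the arms are equal. A sketch comparing allocation ratios for an illustrative total of 300 participants:

```python
def variance_inflation(ratio, n_total=300):
    """Variance of a difference in means under an r:1 allocation,
    relative to the 1:1 split (unit variance per subject assumed)."""
    n_small = n_total / (ratio + 1)
    n_large = n_total - n_small
    return (1 / n_large + 1 / n_small) / (4 / n_total)

inflation = {r: variance_inflation(r) for r in (1, 2, 3)}
```

    A 2:1 allocation inflates the variance by 12.5% and a 3:1 allocation by a third, which is the statistical cost that DUR trades against the hoped-for gain in participation rates.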

  2. Design and Calibration of a Flowfield Survey Rake for Inlet Flight Research

    NASA Technical Reports Server (NTRS)

    Flynn, Darin C.; Ratnayake, Nalin A.; Frederick, Michael

    2009-01-01

    The Propulsion Flight Test Fixture at the NASA Dryden Flight Research Center is a unique test platform available for use on NASA's F-15B aircraft, tail number 836, as a modular host for a variety of aerodynamics and propulsion research. For future flight data from this platform to be valid, more information must be gathered concerning the quality of the airflow underneath the body of the F-15B at various flight conditions, especially supersonic conditions. The flow angularity and Mach number must be known at multiple locations on any test article interface plane for measurement data at these locations to be valid. To determine this prerequisite information, flight data will be gathered in the Rake Airflow Gauge Experiment using a custom-designed flowfield rake to probe the airflow underneath the F-15B at the desired flight conditions. This paper addresses the design considerations of the rake and probe assembly, including the loads and stress analysis using analytical methods, computational fluid dynamics, and finite element analysis. It also details the flow calibration procedure, including the completed wind-tunnel test and posttest data reduction, calibration verification, and preparation for flight-testing.

  3. Hybrid optimization methodology of variable densities mesh model for the axial supporting design of wide-field survey telescope

    NASA Astrophysics Data System (ADS)

    Wang, Hairen; Lou, Zheng; Qian, Yuan; Zheng, Xianzhong; Zuo, Yingxi

    2016-03-01

    The optimization of a primary mirror support system is one of the most critical problems in the design of large telescopes. Here, we propose a hybrid optimization methodology of variable densities mesh model (HOMVDMM) for the axial supporting design, which has three key steps: (1) creating a variable densities mesh model, which partitions the mirror into several sparse mesh areas and several dense mesh areas; (2) global optimization of the primary mirror support based on the zero-order optimization method with a large tolerance; (3) based on the results of the second step, further optimization with the first-order optimization method in the dense mesh areas with a small tolerance. HOMVDMM exploits the complementary merits of the zero- and first-order optimizations, with the former acting on the global scale and the latter on the small scale. As an application, the axial support of the primary mirror of the 2.5-m wide-field survey telescope (WFST) is optimized by HOMVDMM. Three designs were obtained via a comparative study of different numbers of supporting points: 27, 39, and 54. Their residual half-path length errors are 28.78, 9.32, and 5.29 nm, respectively; the latter two designs both meet the specification of WFST. For each of the three designs, a high-accuracy global optimization value is obtained within an hour on an ordinary PC. As the results suggest, the overall performance of HOMVDMM is superior to the first-order optimization method as well as the zero-order optimization method.
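
    The coarse-then-fine idea behind HOMVDMM — a zero-order global pass followed by a first-order local refinement — can be sketched in one dimension. This is a toy illustration of the strategy only, not the authors' mesh-based implementation:

```python
def hybrid_minimize(f, grad, lo, hi, coarse_steps=20, lr=0.01, iters=200):
    """Zero-order global search on a coarse grid, then first-order
    gradient-descent refinement starting from the best grid point."""
    # Zero-order pass: evaluate f on a coarse grid and keep the best point
    grid = [lo + (hi - lo) * i / (coarse_steps - 1) for i in range(coarse_steps)]
    x = min(grid, key=f)
    # First-order pass: refine locally with gradient descent
    for _ in range(iters):
        x -= lr * grad(x)
    return x

# Toy objective with two basins; the global minimum is at x = 2
f = lambda x: (x - 2) ** 2 * ((x + 1) ** 2 + 0.1)
grad = lambda x: 2 * (x - 2) * ((x + 1) ** 2 + 0.1) + 2 * (x + 1) * (x - 2) ** 2
x_best = hybrid_minimize(f, grad, -4.0, 4.0)
```

    Because the coarse grid lands in the correct basin, the gradient steps only polish the answer, mirroring the large-tolerance/small-tolerance split between steps (2) and (3) of the methodology.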

  4. Industry survey of wafer fab reticle quality control strategies in the 90nm-45nm design-rule age

    NASA Astrophysics Data System (ADS)

    Dover, Russell

    2007-10-01

    Reticle quality control in wafer fabs is different from quality control in mask shops. Mask shop requirements are typically inspectability of mask-type, resolution and sensitivity, with the latter usually being the most important. Mask shop sensitivity requirements are also fairly absolute. All defects or imperfections of a certain specification have to be found 100 percent of the time, every time. Wafer fab requirements are an interplay between inspectability, sensitivity, and the economic cost of inspection versus the economic risk of not inspecting. Early warning and defect signatures versus absolute capture of all defects is a key distinction between wafer fabs and mask shops. In order to better understand the different strategies and approaches taken by wafer fabs for reticle quality control an industry-wide benchmark survey of leading wafer fabs was undertaken. This paper summarizes the results while retaining the different wafer fabs' anonymity and confidentiality. The approach taken for the survey was specifically designed to be impartial and independent of any tools, solutions or applications available from KLA-Tencor.

  5. The Norwegian Offender Mental Health and Addiction Study – Design and Implementation of a National Survey and Prospective Cohort Study

    PubMed Central

    Bukten, Anne; Lund, Ingunn Olea; Rognli, Eline Borger; Stavseth, Marianne Riksheim; Lobmaier, Philipp; Skurtveit, Svetlana; Clausen, Thomas; Kunøe, Nikolaj

    2015-01-01

    The Norwegian prison inmates are burdened by problems before they enter prison. Few studies have managed to assess this burden and relate it to what occurs for the inmates once they leave the prison. The Norwegian Offender Mental Health and Addiction (NorMA) study is a large-scale longitudinal cohort study that combines national survey and registry data in order to understand mental health, substance use, and criminal activity before, during, and after custody among prisoners in Norway. The main goal of the study is to describe the criminal and health-related trajectories based on both survey and registry linkage information. Data were collected from 1,499 inmates in Norwegian prison facilities during 2013–2014. Of these, 741 inmates provided a valid personal identification number and constitute a cohort that will be examined retrospectively and prospectively, along with data from nationwide Norwegian registries. This study describes the design, procedures, and implementation of the ongoing NorMA study and provides an outline of the initial data. PMID:26648732

  6. The C-Band All-Sky Survey (C-BASS): design and implementation of the northern receiver

    NASA Astrophysics Data System (ADS)

    King, O. G.; Jones, Michael E.; Blackhurst, E. J.; Copley, C.; Davis, R. J.; Dickinson, C.; Holler, C. M.; Irfan, M. O.; John, J. J.; Leahy, J. P.; Leech, J.; Muchovej, S. J. C.; Pearson, T. J.; Stevenson, M. A.; Taylor, Angela C.

    2014-03-01

    The C-Band All-Sky Survey is a project to map the full sky in total intensity and linear polarization at 5 GHz. The northern component of the survey uses a broad-band single-frequency analogue receiver fitted to a 6.1-m telescope at the Owens Valley Radio Observatory in California, USA. The receiver architecture combines a continuous-comparison radiometer and a correlation polarimeter in a single receiver for stable simultaneous measurement of both total intensity and linear polarization, using custom-designed analogue receiver components. The continuous-comparison radiometer measures the temperature difference between the sky and temperature-stabilized cold electrical reference loads. A cryogenic front-end is used to minimize receiver noise, with a system temperature of ≈30 K in both linear polarization and total intensity. Custom cryogenic notch filters are used to counteract man-made radio frequency interference. The radiometer 1/f noise is dominated by atmospheric fluctuations, while the polarimeter achieves a 1/f noise knee frequency of 10 mHz, similar to the telescope azimuthal scan frequency.

  7. Availability and Structure of Ambulatory Rehabilitation Services: A Survey of Hospitals with Designated Rehabilitation Beds in Ontario, Canada

    PubMed Central

    Passalent, Laura A.; Cott, Cheryl A.

    2008-01-01

    Purpose: To determine the degree to which ambulatory physical therapy (PT), occupational therapy (OT), and speech language pathology (SLP) services are available in hospitals with designated rehabilitation beds (DRBs) in Ontario, and to explore the structure of delivery and funding among services that exist. Methods: Questions regarding ambulatory services were included in the System Integration and Change (SIC) survey sent to all hospitals participating in the Hospital Report 2005: Rehabilitation initiative. Results: The response rate was 75.9% (41 of 54 hospitals). All hospitals surveyed provide some degree of ambulatory rehabilitation services, but the nature of these services varies according to rehabilitation client groups (RCGs). The majority of hospitals continue to deliver services through their employees rather than by contracting out or by creating for-profit subsidiary clinics, but an increasing proportion is accessing private sources to finance ambulatory services. Conclusions: Most hospitals with DRBs provide some degree of ambulatory rehabilitation services. Privatization of delivery is not widespread in these facilities. PMID:20145757

  8. Quantitative analysis

    PubMed Central

    Nevin, John A.

    1984-01-01

    Quantitative analysis permits the isolation of invariant relations in the study of behavior. The parameters of these relations can serve as higher-order dependent variables in more extensive analyses. These points are illustrated by reference to quantitative descriptions of performance maintained by concurrent schedules, multiple schedules, and signal-detection procedures. Such quantitative descriptions of empirical data may be derived from mathematical theories, which in turn can lead to novel empirical analyses so long as their terms refer to behavioral and environmental events. Thus, quantitative analysis is an integral aspect of the experimental analysis of behavior. PMID:16812400

  9. TOPoS. I. Survey design and analysis of the first sample

    NASA Astrophysics Data System (ADS)

    Caffau, E.; Bonifacio, P.; Sbordone, L.; François, P.; Monaco, L.; Spite, M.; Plez, B.; Cayrel, R.; Christlieb, N.; Clark, P.; Glover, S.; Klessen, R.; Koch, A.; Ludwig, H.-G.; Spite, F.; Steffen, M.; Zaggia, S.

    2013-12-01

    Context. The metal-weak tail of the metallicity distribution function (MDF) of the Galactic Halo stars contains crucial information on the formation mode of the first generation of stars. To determine this observationally, it is necessary to observe large numbers of extremely metal-poor stars. Aims: We present here the Turn-Off Primordial Stars survey (TOPoS) that is conducted as an ESO Large Programme at the VLT. This project has four main goals: (i) to understand the formation of low-mass stars in a low-metallicity gas: determine the metal-weak tail of the halo MDF below [M/H] = -3.5; in particular, we aim at determining the critical metallicity, that is the lowest metallicity sufficient for the formation of low-mass stars; (ii) to determine in extremely metal-poor stars the relative abundances of the elements that are the signature of the massive first stars; (iii) to determine the trend of the lithium abundance at the time when the Galaxy formed; and (iv) to derive the fraction of C-enhanced extremely metal-poor stars with respect to normal extremely metal-poor stars. The large number of stars observed in the SDSS provides a good sample of candidate stars at extremely low metallicity. Methods: Candidates with turn-off colours down to magnitude g = 20 were selected from the low-resolution spectra of SDSS by means of an automated procedure. X-Shooter has the potential of performing the necessary follow-up spectroscopy, providing accurate metallicities and abundance ratios for several key elements for these stars. Results: We present here the stellar parameters of the first set of stars. The nineteen stars range in iron abundance between -4.1 and -2.9 dex relative to the Sun. Two stars have a high radial velocity and, according to our estimate of their kinematics, appear to be marginally bound to the Galaxy and are possibly accreted from another galaxy. Based on observations obtained at ESO Paranal Observatory, GTO programme 189.D-0165(A).

  10. Design of a Mars Airplane Propulsion System for the Aerial Regional-Scale Environmental Survey (ARES) Mission Concept

    NASA Technical Reports Server (NTRS)

    Kuhl, Christopher A.

    2009-01-01

    The Aerial Regional-Scale Environmental Survey (ARES) is a Mars exploration mission concept with the goal of taking scientific measurements of the atmosphere, surface, and subsurface of Mars by using an airplane as the payload platform. The ARES team first conducted a Phase-A study for a 2007 launch opportunity, which was completed in May 2003. Following this study, significant efforts were undertaken to reduce the risk of the atmospheric flight system under the NASA Langley Planetary Airplane Risk Reduction Project. The concept was then proposed to the Mars Scout program in 2006 for a 2011 launch opportunity. This paper summarizes the design and development of the ARES airplane propulsion subsystem, beginning with the inception of the ARES project in 2002 through the submittal of the Mars Scout proposal in July 2006.

  11. Survey on effect of surface winds on aircraft design and operation and recommendations for needed wind research

    NASA Technical Reports Server (NTRS)

    Houbolt, J. C.

    1973-01-01

    A survey of the effects of environmental surface winds and gusts on aircraft design and operation is presented, along with a listing of the very large number of problems encountered. Attention is called to the many studies that have been made of surface winds and gusts, but the engineering application of these results to aeronautical problems is noted to be still in an embryonic stage. Control of the aircraft is of paramount concern. Mathematical models and their application in simulation studies of airplane operation and control are discussed, and an attempt is made to identify their main gaps and deficiencies. Key reference material is cited. The need for better exchange between the meteorologist and the aeronautical engineer is discussed, and suggestions for improvements in the wind and gust models are made.

  12. Quantitative analysis of human ankle characteristics at different gait phases and speeds for utilizing in ankle-foot prosthetic design

    PubMed Central

    2014-01-01

    Background Ankle characteristics vary in terms of gait phase and speed change. This study aimed to quantify the components of ankle characteristics, including quasi-stiffness and work in different gait phases and at various speeds. Methods The kinetic and kinematic data of 20 healthy participants were collected during normal gait at four speeds. Stance moment-angle curves were divided into three sub-phases including controlled plantarflexion, controlled dorsiflexion and powered plantarflexion. The slope of the moment-angle curves was quantified as quasi-stiffness. The area under the curves was defined as work. Results The lowest quasi-stiffness was observed in the controlled plantarflexion. The fitted line to moment-angle curves showed R2 > 0.8 at controlled dorsiflexion and powered plantarflexion. Quasi-stiffness was significantly different at different speeds (P = 0.00). In the controlled dorsiflexion, the ankle absorbed energy; by comparison, energy was generated in the powered plantarflexion. A negative work value was recorded at slower speeds and a positive value was observed at faster speeds. Ankle peak powers were increased with walking speed (P = 0.00). Conclusions Our findings suggested that the quasi-stiffness and work of the ankle joint can be regulated at different phases and speeds. These findings may be clinically applicable in the design and development of ankle prosthetic devices that can naturally replicate human walking at various gait speeds. PMID:24568175
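
    The two quantities defined in this abstract, quasi-stiffness (the slope of the moment-angle curve) and work (the area under it), can be computed with a short sketch. This is an illustrative calculation under invented data, not the authors' processing pipeline; the ~300 N·m/rad sample stiffness is an assumption for the example.

```python
def quasi_stiffness(angle, moment):
    """Quasi-stiffness: least-squares slope of the moment-angle curve
    for one gait sub-phase (units: N*m/rad)."""
    n = len(angle)
    mean_a = sum(angle) / n
    mean_m = sum(moment) / n
    num = sum((a - mean_a) * (m - mean_m) for a, m in zip(angle, moment))
    den = sum((a - mean_a) ** 2 for a in angle)
    return num / den

def joint_work(angle, moment):
    """Work: trapezoidal area under the moment-angle curve (J).
    Negative values indicate energy absorption, positive generation."""
    return sum((moment[i] + moment[i + 1]) / 2 * (angle[i + 1] - angle[i])
               for i in range(len(angle) - 1))

# Invented, roughly linear dorsiflexion segment (~300 N*m/rad):
angle = [i * 0.2 / 49 for i in range(50)]   # rad
moment = [300.0 * a + 5.0 for a in angle]   # N*m
```

    Fitting a line to each sub-phase separately, as the study does, yields one quasi-stiffness per phase; the sign of the integrated work then distinguishes the absorbing (controlled dorsiflexion) from the generating (powered plantarflexion) phases.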

  13. The role of the elastic rebound theory in design and evaluation of deformation surveys

    NASA Astrophysics Data System (ADS)

    Bányai, L.

    1992-02-01

    The elastic rebound theory of shallow tectonic earthquakes proposed by H.F. Reid as a comprehensive description of deformation in time along an active fault is analysed from a geodetic point of view. The relationship between the different stages of deformation and the epochs of geodetic measurements emphasizes the importance of proper network design, because the three basic hypotheses investigated predict displacements characterising the process in different zones of the network. The proper choice of datum stations according to these hypotheses and the similarity transformation applied for generalized, free network determination proved to be a time-saving approach. The proposed procedure may be advantageous when the noise levels of observations are near to the expected deformations.

  14. Assessing functional relations in single-case designs: quantitative proposals in the context of the evidence-based movement.

    PubMed

    Manolov, Rumen; Sierra, Vicenta; Solanas, Antonio; Botella, Juan

    2014-11-01

    In the context of the evidence-based practices movement, the emphasis on computing effect sizes and combining them via meta-analysis does not preclude the demonstration of functional relations. For the latter aim, we propose to augment the visual analysis to add consistency to the decisions made on the existence of a functional relation without losing sight of the need for a methodological evaluation of what stimuli and reinforcement or punishment are used to control the behavior. Four options for quantification are reviewed, illustrated, and tested with simulated data. These quantifications include comparing the projected baseline with the actual treatment measurements, on the basis of either parametric or nonparametric statistics. The simulated data used to test the quantifications include nine data patterns in terms of the presence and type of effect and comprise ABAB and multiple-baseline designs. Although none of the techniques is completely flawless in terms of detecting a functional relation only when it is present but not when it is absent, an option based on projecting split-middle trend and considering data variability as in exploratory data analysis proves to be the best performer for most data patterns. We suggest that the information on whether a functional relation has been demonstrated should be included in meta-analyses. It is also possible to use as a weight the inverse of the data variability measure used in the quantification for assessing the functional relation. We offer an easy to use code for open-source software for implementing some of the quantifications. PMID:25092718
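
    The best-performing option described above projects a split-middle baseline trend into the treatment phase before comparing phases. The sketch below is a simplified illustration of that idea, not the authors' exact procedure: the halving rule, the line through the two half-medians, and the "count points above the projection" criterion are assumptions made for this example.

```python
from statistics import median

def split_middle_trend(baseline):
    """Split-middle trend of a baseline phase: a line through the
    median (session index, score) point of each half of the data."""
    n = len(baseline)
    half = n // 2
    x1, y1 = median(range(half)), median(baseline[:half])
    x2, y2 = median(range(half, n)), median(baseline[half:])
    slope = (y2 - y1) / (x2 - x1)
    return slope, y1 - slope * x1  # slope, intercept

def points_above_projection(baseline, treatment):
    """Count treatment-phase points above the projected baseline
    trend -- a crude marker of a possible functional relation."""
    slope, intercept = split_middle_trend(baseline)
    start = len(baseline)
    return sum(y > slope * (start + i) + intercept
               for i, y in enumerate(treatment))
```

    For a flat-to-rising baseline such as [2, 3, 2, 4, 3, 4], all three treatment points [7, 8, 9] fall above the projected trend, which visual analysts would read as consistent with an effect; the published quantifications add variability bands around the projection before counting.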

  15. Survey of Technical Preventative Measures to Reduce Whole-Body Vibration Effects when Designing Mobile Machinery

    NASA Astrophysics Data System (ADS)

    DONATI, P.

    2002-05-01

    Engineering solutions to minimize the effects of vibrating mobile machinery on operators can be conveniently grouped into three areas: (1) reduction of vibration at the source, through improvement of the quality of terrain, careful selection of the vehicle or machine, correct loading, proper maintenance, etc.; (2) reduction of vibration transmission, by incorporating suspension systems (tyres, vehicle suspensions, suspended cabs and seats) between the operator and the source of vibration; and (3) improvement of cab ergonomics and seat profiles to optimize operator posture. This paper reviews the different techniques and problems linked to categories (2) and (3). According to epidemiological studies, the main health risk of whole-body vibration exposure appears to be lower back pain. When designing new mobile machinery, all factors that may contribute to back injury should be considered in order to reduce risk. For example, an optimized seat suspension is useless if it cannot be correctly and easily adjusted to the driver's weight, or if the driver is forced to drive in a bent position to avoid his head striking the ceiling due to the spatial requirement of the suspension seat.

  16. SURVEY INSTRUMENT

    DOEpatents

    Borkowski, C.J.

    1954-01-19

    This pulse-type survey instrument is suitable for readily detecting {alpha} particles in the presence of high {beta} and {gamma} backgrounds. The instruments may also be used to survey for neutrons, {beta} particles and {gamma} rays by employing suitably designed interchangeable probes and selecting an operating potential to correspond to the particular probe.

  17. Statistics of Local Public School Systems, Fall 1970: Staff. Elementary-Secondary General Information Survey Series.

    ERIC Educational Resources Information Center

    Hughes, Warren A.

    This publication is the fourth report in an annual survey series designed to provide reliable data on individual local public school systems for planning, policy, and research purposes. The report contains tables of national estimates and basic data tables providing quantitative staff data on the school systems in the survey. The data are derived…

  18. Making Full Use of the Longitudinal Design of the Current Population Survey: Methods for Linking Records Across 16 Months *

    PubMed Central

    Drew, Julia A. Rivera; Flood, Sarah; Warren, John Robert

    2015-01-01

    Data from the Current Population Survey (CPS) are rarely analyzed in a way that takes advantage of the CPS’s longitudinal design. This is mainly because of the technical difficulties associated with linking CPS files across months. In this paper, we describe the method we are using to create unique identifiers for all CPS person and household records from 1989 onward. These identifiers—available along with CPS basic and supplemental data as part of the on-line Integrated Public Use Microdata Series (IPUMS)—make it dramatically easier to use CPS data for longitudinal research across any number of substantive domains. To facilitate the use of these new longitudinal IPUMS-CPS data, we also outline seven different ways that researchers may choose to link CPS person records across months, and we describe the sample sizes and sample retention rates associated with these seven designs. Finally, we discuss a number of unique methodological challenges that researchers will confront when analyzing data from linked CPS files. PMID:26113770

  19. Flow field survey near the rotational plane of an advanced design propeller on a JetStar airplane

    NASA Technical Reports Server (NTRS)

    Walsh, K. R.

    1985-01-01

    An investigation was conducted to obtain upper fuselage surface static pressures and boundary layer velocity profiles below the centerline of an advanced design propeller. This investigation documents the upper fuselage velocity flow field in support of the in-flight acoustic tests conducted on a JetStar airplane. Initial results of the boundary layer survey showed evidence of an unusual flow disturbance, which was attributed to the two windshield wiper assemblies on the aircraft. The assemblies were removed, eliminating the disturbances from the flow field. This report presents boundary layer velocity profiles at altitudes of 6096 and 9144 m (20,000 and 30,000 ft) and Mach numbers from 0.6 to 0.8, and investigates the effects of the windshield wiper assemblies on these profiles. Because of the unconventional velocity profiles that were obtained with the assemblies mounted, classical boundary layer parameters, such as momentum and displacement thicknesses, are not presented. The effects of flight test variables (Mach number and angles of attack and sideslip) and an advanced design propeller on boundary layer profiles - with the wiper assemblies mounted and removed - are presented.

  20. Willingness to pay for treated mosquito nets in Surat, India: the design and descriptive analysis of a household survey.

    PubMed

    Bhatia, M R; Fox-Rushby, J A

    2002-12-01

    For willingness to pay (WTP) studies to have an appropriate impact on policy making, it is essential that the design and analysis are undertaken carefully. This paper aims to describe and justify the design of the survey tool used to assess hypothetical WTP for treated mosquito nets (TMN) in rural Surat, India and report its findings. Results from qualitative work were used as an input for developing the WTP questionnaire. A total of 1200 households belonging to 80 villages in rural Surat were selected for the study. A bidding format was used to elicit WTP values, using three different starting bids. The scenario was constructed in a way to reduce the possibility of respondents acting strategically. The response rate was 100%. About 79% of the respondents were willing to buy TMNs and the mean WTP was Rs57. Descriptive results of economic and other taste and preference variables are also presented, which include preventive measures used by households and treatment seeking behaviour for malaria. It is observed that WTP as well as demographic variables and prevention methods differ significantly across arms of the trial. This paper suggests that policy-makers could use the evidence following further analysis, along with information on costs of implementation, to ascertain the levels of subsidy that may be needed at different levels of coverage. PMID:12424212
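
    The bidding format used to elicit WTP values can be sketched as an iterative accept/reject game. The starting bid, step size, and stopping rule below are invented for illustration; the paper's actual protocol (including its three starting bids) is not reproduced here.

```python
def bidding_wtp(respondent_accepts, start_bid, step, max_rounds=10):
    """Bidding-game elicitation sketch: raise the bid after an
    acceptance, lower it after a refusal, and report the highest
    accepted bid as the respondent's stated willingness to pay."""
    bid = start_bid
    highest_accepted = 0.0
    for _ in range(max_rounds):
        if respondent_accepts(bid):
            highest_accepted = max(highest_accepted, bid)
            bid += step
        else:
            bid -= step
            if bid <= highest_accepted:
                break  # bracket found: refusal just above an acceptance
    return highest_accepted

# Hypothetical respondent whose true WTP is Rs 57:
accepts = lambda bid: bid <= 57
```

    With a starting bid of Rs 50 and a step of Re 1, the game converges on Rs 57; a coarser step of Rs 5 brackets the value less precisely (stopping at Rs 55), which is one reason the choice of starting bids and steps matters for such surveys.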

  1. Tailings Pond Characterization And Designing Through Geophysical Surveys In Dipping Sedimentary Formations

    NASA Astrophysics Data System (ADS)

    Muralidharan, D.; Andrade, R.; Anand, K.; Sathish, R.; Goud, K.

    2009-12-01

    Mining activities generate disintegrated waste materials of increased mobility that require safe disposal, either by backfilling or by secluded surface storage that prevents interaction with the environmental cycle. Surface disposal becomes more critical when the mined minerals contain toxic or radioactive elements. In such cases, the subsurface of the disposal site must be characterized to understand its role in the environmental impact of the stored waste. Near-surface geophysics plays a major role in mapping the geophysical character of the subsurface formations in and around a disposal site and, to some extent, helps in designing the storage structure. Integrated geophysical methods involving resistivity tomography, ground magnetics, and shallow seismics were carried out over a proposed tailings pond area of 0.3 sq. km underlain by dipping sedimentary rocks consisting of ferruginous shales and dolomitic to siliceous limestone of varying thickness. Because the site lies in a tectonically disturbed area, the geophysical investigations were carried out along numerous profiles to visualize the subsurface clearly. Integrating the results of twenty resistivity tomography profiles with 2 m (shallow) and 10 m (moderate-depth) electrode spacings enabled the preparation of a probable subsurface geological section along the strike direction of the formation under the tailings pond, including a geo-tectonic structure inferred to be a fault. Two resistivity tomography profiles perpendicular to the strike of the formations brought out a buried basic intrusive body on the northern boundary of the proposed tailings pond, and two criss-crossing profiles over the suspected fault zone confirmed a fault on its north-eastern part. Thirty-two magnetic profiles inside the tailings pond and in the surrounding areas to its south identified two parallel east-west intrusive bodies forming an impermeable boundary for the pond. The shallow seismic refraction and other geophysical studies in and around the proposed tailings pond established the suitability of the site: even if toxic elements percolate through the subsurface formations into the groundwater system, the dykes on either side of the proposed ponding area will not allow water to move across them, restricting contamination to the tailings pond area. Similarly, the delineation of a fault zone within the tailings pond area led to shifting of the proposed dam axis to avoid leakage through the fault zone and the resulting environmental pollution.

  2. Quantitative plant proteomics.

    PubMed

    Bindschedler, Laurence V; Cramer, Rainer

    2011-02-01

    Quantitation is an inherent requirement in comparative proteomics and there is no exception to this for plant proteomics. Quantitative proteomics has high demands on the experimental workflow, requiring a thorough design and often a complex multi-step structure. It has to include sufficient numbers of biological and technical replicates and methods that are able to facilitate a quantitative signal read-out. Quantitative plant proteomics in particular poses many additional challenges but because of the nature of plants it also offers some potential advantages. In general, analysis of plants has been less prominent in proteomics. Low protein concentration, difficulties in protein extraction, genome multiploidy, high Rubisco abundance in green tissue, and an absence of well-annotated and completed genome sequences are some of the main challenges in plant proteomics. However, the latter is now changing with several genomes emerging for model plants and crops such as potato, tomato, soybean, rice, maize and barley. This review discusses the current status in quantitative plant proteomics (MS-based and non-MS-based) and its challenges and potentials. Both relative and absolute quantitation methods in plant proteomics from DIGE to MS-based analysis after isotope labeling and label-free quantitation are described and illustrated by published studies. In particular, we describe plant-specific quantitative methods such as metabolic labeling methods that can take full advantage of plant metabolism and culture practices, and discuss other potential advantages and challenges that may arise from the unique properties of plants. PMID:21246733

  3. Design, methods and demographic findings of the DEMINVALL survey: a population-based study of Dementia in Valladolid, Northwestern Spain

    PubMed Central

    2012-01-01

    Background This article describes the rationale and design of a population-based survey of dementia in Valladolid (northwestern Spain). The main aim of the study was to assess the epidemiology of dementia and its subtypes. Prevalence of anosognosia in dementia patients, nutritional status, diet characteristics, and determinants of non-diagnosed dementia in the community were studied. The main sociodemographic, educational, and general health status characteristics of the study population are described. Methods Cross-over and cohort, population-based study. A two-phase door-to-door study was performed. Both urban and rural environments were included. In phase 1 (February 2009 – February 2010) 28 trained physicians examined a population of 2,989 subjects (age: ≥ 65 years). The seven-minute screen neurocognitive battery was used. In phase 2 (May 2009 – May 2010) 4 neurologists, 1 geriatrician, and 3 neuropsychologists confirmed the diagnosis of dementia and subtype in patients screened positive by a structured neurological evaluation. Specific instruments to assess anosognosia, the nutritional status and diet characteristics were used. Of the initial sample, 2,170 subjects were evaluated (57% female, mean age 76.5 ± 7.8, 5.2% institutionalized), whose characteristics are described. 227 persons were excluded for various reasons. Among those eligible were 592 non-responders. The attrition bias of non-responders was lower in rural areas. 241 screened positive (11.1%). Discussion The survey will explore some clinical, social and health related life-style variables of dementia. The population size and the diversification of social and educational backgrounds will contribute to a better knowledge of dementia in our environment. PMID:22935626

  4. National Aquatic Resource Surveys: Use of Geospatial data in their design and spatial prediction at non-monitored locations

    EPA Science Inventory

    The National Aquatic Resource Surveys (NARS) are four surveys conducted by the U.S. Environmental Protection Agency working in collaboration with states, tribal nations and other federal agencies. The surveys are conducted for lakes and reservoirs, streams, estuaries and intracoa...

  5. On Quantitizing

    PubMed Central

    Sandelowski, Margarete; Voils, Corrine I.; Knafl, George

    2009-01-01

    Quantitizing, commonly understood to refer to the numerical translation, transformation, or conversion of qualitative data, has become a staple of mixed methods research. Typically glossed are the foundational assumptions, judgments, and compromises involved in converting disparate data sets into each other and whether such conversions advance inquiry. Among these assumptions are that qualitative and quantitative data constitute two kinds of data, that quantitizing constitutes a unidirectional process essentially different from qualitizing, and that counting is an unambiguous process. Among the judgments are deciding what and how to count. Among the compromises are balancing numerical precision with narrative complexity. The standpoints of “conditional complementarity,” “critical remediation,” and “analytic alternation” clarify the added value of converting qualitative data into quantitative form. PMID:19865603

  6. A mental health needs assessment of children and adolescents in post-conflict Liberia: results from a quantitative key-informant survey

    PubMed Central

    Borba, Christina P.C.; Ng, Lauren C.; Stevenson, Anne; Vesga-Lopez, Oriana; Harris, Benjamin L.; Parnarouskis, Lindsey; Gray, Deborah A.; Carney, Julia R.; Domínguez, Silvia; Wang, Edward K.S.; Boxill, Ryan; Song, Suzan J.; Henderson, David C.

    2016-01-01

    Between 1989 and 2004, Liberia experienced a devastating civil war that resulted in widespread trauma with almost no mental health infrastructure to help citizens cope. In 2009, the Liberian Ministry of Health and Social Welfare collaborated with researchers from Massachusetts General Hospital to conduct a rapid needs assessment survey in Liberia with local key informants (n = 171) to examine the impact of war and post-war events on emotional and behavioral problems of, functional limitations of, and appropriate treatment settings for Liberian youth aged 5–22. War exposure and post-conflict sexual violence, poverty, infectious disease and parental death negatively impacted youth mental health. Key informants perceived that youth displayed internalizing and externalizing symptoms and mental health-related functional impairment at home, school, work and in relationships. Medical clinics were identified as the most appropriate setting for mental health services. Youth in Liberia continue to endure the harsh social, economic and material conditions of everyday life in a protracted post-conflict state, and have significant mental health needs. Their observed functional impairment due to mental health issues further limited their access to protective factors such as education, employment and positive social relationships. Results from this study informed Liberia's first post-conflict mental health policy. PMID:26807147

  7. A Study Investigating Indian Middle School Students' Ideas of Design and Designers

    ERIC Educational Resources Information Center

    Ara, Farhat; Chunawala, Sugra; Natarajan, Chitra

    2011-01-01

    This paper reports on an investigation into middle school students' naive ideas about, and attitudes towards design and designers. The sample for the survey consisted of students from Classes 7 to 9 from a school located in Mumbai. The data were analysed qualitatively and quantitatively to look for trends in students' responses. Results show that…

  8. Isotopic analogs as internal standards for quantitative analyses by GC/MS--evaluation of cross-contribution to ions designated for the analyte and the isotopic internal standard.

    PubMed

    Chang, W T; Lin, D L; Liu, R H

    2001-10-01

    Isotopic analogs of the analytes are currently the preferred internal standards (IS) for quantitative analyses of drugs and their metabolites in biological matrices by GC/MS procedures. Contributions of the analyte and the IS to the intensities of the ions designated for the IS and the analyte, respectively--an undesirable phenomenon termed "cross-contribution"--greatly weaken the effectiveness of this approach. The cross-contribution phenomenon has in the past been evaluated by a "direct measurement" approach, in which the intensities of the ions of interest were measured in two separate experiments using equal quantities of the analyte and the IS. Alternative procedures that may generate improved results are studied here. In the "improved direct measurement" approach, ion intensity data derived from the previously reported direct measurement procedure are first normalized before being used to calculate the extent of cross-contribution. An "internal standard" approach is also developed, in which a set amount of a third compound is incorporated into the two separate experiments, allowing correction of ion intensity data for variations inherent to separate experiments. Finally, a "standard addition" approach, involving serial additions of standards, generates multiple data points, thus providing a mechanism to validate the resulting cross-contribution data. Secobarbital/(2)H(5)-secobarbital and secobarbital/(13)C(4)-secobarbital pairs are adopted as the exemplar systems for this study. PMID:11566421
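
    The quantity being measured can be illustrated with a toy calculation: the intensity that a pure compound shows at the ion designated for its counterpart, expressed relative to its own designated ion. The m/z values and intensities below are invented, and this generic ratio is only a simplified stand-in for the paper's normalization schemes.

```python
def cross_contribution_pct(spectrum, own_ion, counterpart_ion):
    """Percent cross-contribution of a *pure* compound to the ion
    designated for its counterpart (analyte -> IS ion, or IS ->
    analyte ion), relative to the compound's own designated ion."""
    own = spectrum[own_ion]
    return 100.0 * spectrum.get(counterpart_ion, 0.0) / own

# Invented spectrum from an injection of pure analyte only
# (hypothetical m/z values, not secobarbital's actual ions):
analyte = {168: 100000.0, 173: 850.0}  # 168 = analyte ion, 173 = IS ion
```

    Here the pure analyte contributes 0.85% to the IS-designated ion; repeating the measurement with the pure IS gives the reverse contribution, and the paper's improved approaches aim to make these two separate runs comparable.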

  9. Design and evaluation of a 16S rRNA-targeted oligonucleotide probe for specific detection and quantitation of human faecal Bacteroides populations.

    PubMed

    Doré, J; Sghir, A; Hannequart-Gramet, G; Corthier, G; Pochart, P

    1998-03-01

    Colonic Bacteroides include several species which, by their population level and activities, are significant contributors to the metabolic activity and health of man and animals. Yet the understanding of their ecology has been hampered by the lack of highly specific and reliable enumeration techniques. Based on 16S rRNA sequence comparisons within the available database, we have designed an 18-mer oligonucleotide that targets a region common to, and specific for, the Bacteroides-Porphyromonas-Prevotella group. We have tested the specificity of the probe and its usefulness for studies of human faecal samples. Under experimentally optimized hybridization conditions, the probe was shown to recognize equally the rDNA obtained from 40 strains representing 8 species of the Bacteroides-Porphyromonas-Prevotella group. Importantly, it did not recognize 31 strains of microorganisms representing 8 genera of the dominant human faecal microbiota. Among selected colonies of dominant microorganisms of the faecal flora of two human individuals, all strains identified as B. vulgatus by immunoblotting with a species-specific monoclonal antibody were detected by the probe. Colony hybridization was used to enumerate total Bacteroides-group microorganisms in faecal specimens from children and adults. The probe was further used in quantitative RNA blots to monitor fluctuations of the Bacteroides group versus the Bifidobacterium genus in frozen faecal samples from a child between 85 and 125 days of age. It will be applicable to similar investigations of other anaerobic environments. PMID:9741111

  10. Design of a Survey for Determining Training and Personnel Requirements for Educational Research, Development, Dissemination and Evaluation. Vol. 2, Development Pretest of Questionnaires. Final Report.

    ERIC Educational Resources Information Center

    Rittenhouse, Carl

    This study describes the development and design for pretesting survey instruments required for the development of programs sampling the supply and demand for educational research, development, diffusion, and evaluation personnel. The major areas of concern include: 1) the determination of number, distribution by type, and location of educational…

  11. A Dish-based Semi-quantitative Food Frequency Questionnaire for Assessment of Dietary Intakes in Epidemiologic Studies in Iran: Design and Development

    PubMed Central

    Keshteli, AH; Esmaillzadeh, Ahmad; Rajaie, Somayeh; Askari, Gholamreza; Feinle-Bisset, Christine; Adibi, Peyman

    2014-01-01

    Background: Earlier forms of food frequency questionnaire (FFQ) used in Iran have extensive lists of foods, traditional categories and food-based design, mostly with the interviewer-administered approach. The aim of the current paper is to describe the development of a dish-based, machine-readable, semi-quantitative food frequency questionnaire (DFQ). Methods: Within the framework of the Study on the Epidemiology of Psychological, Alimentary Health and Nutrition project, we created a novel FFQ using Harvard FFQ as a model. Results: The following steps were taken to develop the questionnaire: Construction of a list of commonly consumed Iranian foods, definition of portion sizes, design of response options for consumption frequency of each food item and finally a pilot test of the preliminary DFQ. From a comprehensive list of foods and mixed dishes, we included those that were nutrient-rich, consumed reasonably often or contributed to between-person variations. We focused on mixed dishes, rather than their ingredients, along with foods. To shorten the list, the related food items or mixed dishes were categorized together in one food group. These exclusions resulted in a list of 106 foods or dishes in the questionnaire. The portion sizes used in the FFQ were obtained from our earlier studies that used dietary recalls and food records. The frequency response options for the food list varied from 6-9 choices from “never or less than once a month” to “12 or more times per day”. Conclusions: The DFQ could be a reasonable dietary assessment tool for future epidemiological studies in the country. Validation studies are required to assess the validity and reliability of this newly developed questionnaire. PMID:24554989

  12. Power of quantitative trait locus mapping for polygenic binary traits using generalized and regression interval mapping in multi-family half-sib designs.

    PubMed

    Kadarmideen, H N; Janss, L L; Dekkers, J C

    2000-12-01

    A generalized interval mapping (GIM) method to map quantitative trait loci (QTL) for binary polygenic traits in a multi-family half-sib design is developed based on threshold theory and implemented using a Newton-Raphson algorithm. Statistical power and bias of QTL mapping for binary traits by GIM is compared with linear regression interval mapping (RIM) using simulation. Data on 20 paternal half-sib families were simulated with two genetic markers that bracketed an additive QTL. Data simulated and analysed were: (1) data on the underlying normally distributed liability (NDL) scale, (2) binary data created by truncating NDL data based on three thresholds yielding data sets with three different incidences, and (3) NDL data with polygenic and QTL effects reduced by a proportion equal to the ratio of the heritabilities on the binary versus NDL scale (reduced-NDL). Binary data were simulated with and without systematic environmental (herd) effects in an unbalanced design. GIM and RIM gave similar power to detect the QTL and similar estimates of QTL location, effects and variances. Presence of fixed effects caused differences in bias between RIM and GIM, where GIM showed smaller bias which was affected less by incidence. The original NDL data had higher power and lower bias in QTL parameter estimates than binary and reduced-NDL data. RIM for reduced-NDL and binary data gave similar power and estimates of QTL parameters, indicating that the impact of the binary nature of data on QTL analysis is equivalent to its impact on heritability. PMID:11204977

  13. NATIONAL STREAM SURVEY: PHASE 1 QUALITY ASSURANCE REPORT

    EPA Science Inventory

    The National Stream Survey - Phase I, conducted during the spring of 1986, was designed to assess quantitatively the present chemical status of streams in regions of the eastern United States where aquatic resources are potentially at risk as a result of acidic deposition. A qual...

  14. Surveying the Commons: Current Implementation of Information Commons Web sites

    ERIC Educational Resources Information Center

    Leeder, Christopher

    2009-01-01

    This study assessed the content of 72 academic library Information Commons (IC) Web sites using content analysis, quantitative assessment and qualitative surveys of site administrators to analyze current implementation by the academic library community. Results show that IC Web sites vary widely in content, design and functionality, with few…

  16. Biological effect of low-head sea lamprey barriers: Designs for extensive surveys and the value of incorporating intensive process-oriented research

    USGS Publications Warehouse

    Hayes, D.B.; Baylis, J.R.; Carl, L.M.; Dodd, H.R.; Goldstein, J.D.; McLaughlin, R.L.; Noakes, D.L.G.; Porto, L.M.

    2003-01-01

    Four sampling designs for quantifying the effect of low-head sea lamprey (Petromyzon marinus) barriers on fish communities were evaluated, and the contribution of process-oriented research to the overall confidence of results obtained was discussed. The designs include: (1) sample barrier streams post-construction; (2) sample barrier and reference streams post-construction; (3) sample barrier streams pre- and post-construction; and (4) sample barrier and reference streams pre- and post-construction. In the statistical literature, the principal basis for comparison of sampling designs is generally the precision achieved by each design. In addition to precision, designs should be compared based on the interpretability of results and on the scale to which the results apply. Using data collected in a broad survey of streams with and without sea lamprey barriers, some of the tradeoffs that occur among precision, scale, and interpretability are illustrated. Although circumstances such as funding and availability of pre-construction data may limit which design can be implemented, a pre/post-construction design including barrier and reference streams provides the most meaningful information for use in barrier management decisions. Where it is not feasible to obtain pre-construction data, a design including reference streams is important to maintain the interpretability of results. Regardless of the design used, process-oriented research provides a framework for interpreting results obtained in broad surveys. As such, information from both extensive surveys and intensive process-oriented research provides the best basis for fishery management actions, and gives researchers and managers the most confidence in the conclusions reached regarding the effects of sea lamprey barriers.

  17. Geothermal energy as a source of electricity. A worldwide survey of the design and operation of geothermal power plants

    SciTech Connect

    DiPippo, R.

    1980-01-01

    An overview of geothermal power generation is presented. A survey of geothermal power plants is given for the following countries: China, El Salvador, Iceland, Italy, Japan, Mexico, New Zealand, Philippines, Turkey, USSR, and USA. A survey of countries planning geothermal power plants is included. (MHR)

  18. A Mixed Model Design Study of RN to BS Distance Learning: Survey of Graduates' Perceptions of Strengths and Challenges

    ERIC Educational Resources Information Center

    Lock, Leonard K.; Schnell, Zoanne; Pratt-Mullen, Jerrilynn

    2011-01-01

    This article reports on findings from a survey administered to graduates of a distance learning RN-to-BS completion program. A questionnaire was constructed to examine graduate experiences and perceptions regarding distance learning formats, course content, time management, student empowerment, and program support. A total of 251 surveys were…

  19. Quantitative Thinking.

    ERIC Educational Resources Information Center

    DuBridge, Lee A.

    An appeal for more research to determine how to educate children as effectively as possible is made. Mathematics teachers can readily examine the educational problems of today in their classrooms since learning progress in mathematics can easily be measured and evaluated. Since mathematics teachers have learned to think in quantitative terms and…

  20. On Quantitizing

    ERIC Educational Resources Information Center

    Sandelowski, Margarete; Voils, Corrine I.; Knafl, George

    2009-01-01

    "Quantitizing", commonly understood to refer to the numerical translation, transformation, or conversion of qualitative data, has become a staple of mixed methods research. Typically glossed are the foundational assumptions, judgments, and compromises involved in converting disparate data sets into each other and whether such conversions advance…

  1. QUANTITATIVE MORPHOLOGY

    EPA Science Inventory

    Abstract: In toxicology, the role of quantitative assessment of brain morphology can be understood in the context of two types of treatment-related alterations. One type of alteration is specifically associated with treatment and is not observed in control animals. Measurement ...

  2. Application of screening experimental designs to assess chromatographic isotope effect upon isotope-coded derivatization for quantitative liquid chromatography-mass spectrometry.

    PubMed

    Szarka, Szabolcs; Prokai-Tatrai, Katalin; Prokai, Laszlo

    2014-07-15

    Isotope effect may cause partial chromatographic separation of labeled (heavy) and unlabeled (light) isotopologue pairs. Together with a simultaneous matrix effect, this could lead to unacceptable accuracy in quantitative liquid chromatography-mass spectrometry assays, especially when electrospray ionization is used. Four biologically relevant reactive aldehydes (acrolein, malondialdehyde, 4-hydroxy-2-nonenal, and 4-oxo-2-nonenal) were derivatized with light or heavy (d3-, (13)C6-, (15)N2-, or (15)N4-labeled) 2,4-dinitrophenylhydrazine and used as model compounds to evaluate chromatographic isotope effects. For comprehensive assessment of retention time differences between light/heavy pairs under various gradient reversed-phase liquid chromatography conditions, major chromatographic parameters (stationary phase, mobile phase pH, temperature, organic solvent, and gradient slope) and different isotope labelings were addressed by multiple-factor screening using experimental designs that included both asymmetrical (Addelman) and Plackett-Burman schemes followed by statistical evaluations. Results confirmed that the most effective approach to avoid chromatographic isotope effect is the use of (15)N or (13)C labeling instead of deuterium labeling, while chromatographic parameters had no general influence. Comparison of the alternate isotope-coded derivatization assay (AIDA) using deuterium versus (15)N labeling gave unacceptable differences (>15%) upon quantifying some of the model aldehydes from biological matrixes. On the basis of our results, we recommend the modification of the AIDA protocol by replacing d3-2,4-dinitrophenylhydrazine with (15)N- or (13)C-labeled derivatizing reagent to avoid possible unfavorable consequences of chromatographic isotope effects. PMID:24922593

  3. Application of Screening Experimental Designs to Assess Chromatographic Isotope Effect upon Isotope-Coded Derivatization for Quantitative Liquid Chromatography–Mass Spectrometry

    PubMed Central

    2015-01-01

    Isotope effect may cause partial chromatographic separation of labeled (heavy) and unlabeled (light) isotopologue pairs. Together with a simultaneous matrix effect, this could lead to unacceptable accuracy in quantitative liquid chromatography–mass spectrometry assays, especially when electrospray ionization is used. Four biologically relevant reactive aldehydes (acrolein, malondialdehyde, 4-hydroxy-2-nonenal, and 4-oxo-2-nonenal) were derivatized with light or heavy (d3-, 13C6-, 15N2-, or 15N4-labeled) 2,4-dinitrophenylhydrazine and used as model compounds to evaluate chromatographic isotope effects. For comprehensive assessment of retention time differences between light/heavy pairs under various gradient reversed-phase liquid chromatography conditions, major chromatographic parameters (stationary phase, mobile phase pH, temperature, organic solvent, and gradient slope) and different isotope labelings were addressed by multiple-factor screening using experimental designs that included both asymmetrical (Addelman) and Plackett–Burman schemes followed by statistical evaluations. Results confirmed that the most effective approach to avoid chromatographic isotope effect is the use of 15N or 13C labeling instead of deuterium labeling, while chromatographic parameters had no general influence. Comparison of the alternate isotope-coded derivatization assay (AIDA) using deuterium versus 15N labeling gave unacceptable differences (>15%) upon quantifying some of the model aldehydes from biological matrixes. On the basis of our results, we recommend the modification of the AIDA protocol by replacing d3-2,4-dinitrophenylhydrazine with 15N- or 13C-labeled derivatizing reagent to avoid possible unfavorable consequences of chromatographic isotope effects. PMID:24922593

  4. Rules for the preparation of manuscript and illustrations designed for publication by the United States Geological Survey

    USGS Publications Warehouse

    Hampson, Thomas

    1888-01-01

    In the annual report of the Director of the U. S. Geological Survey for 1885-'86, pages 40 and 41, you set forth the functions of the chief of the editorial division as follows: "To secure clear and accurate statement in the material sent to press, careful proof-reading, and uniformity in the details of book-making, as well as to assist the Director in exercising a general supervision over the publications of the Survey."

  5. Risk-based design of repeated surveys for the documentation of freedom from non-highly contagious diseases.

    PubMed

    Hadorn, Daniela C; Rüfenacht, Jürg; Hauser, Ruth; Stärk, Katharina D C

    2002-12-30

    The documentation of freedom from disease requires reliable information on the actual disease status in a specific animal population. The implementation of active surveillance (surveys) is an effective method to gain this information. For economic reasons, the sample size should be as small as possible but large enough to achieve the required confidence level for a targeted threshold. When conducting surveys repeatedly, various information sources about the disease status of the population (e.g. risk assessments regarding disease introduction and results of previous surveys) can be taken into account to adjust the required level of confidence for a follow-up survey. As a benefit, the sample size for national surveys can be reduced considerably. We illustrate this risk-based approach using examples of national surveys conducted in Switzerland. The sample size for the documentation of freedom from enzootic bovine leucosis (EBL) and Brucella melitensis in sheep and in goats could be reduced from 2325 to 415 cattle herds, from 2325 to 838 sheep herds and from 1975 to 761 goat herds, respectively. PMID:12441234
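The abstract does not state its sample-size formula, but the standard freedom-from-disease calculation for a large population and a perfect test is n = ln(1 − confidence) / ln(1 − design prevalence); lowering the required confidence (e.g. on the basis of a favourable risk assessment) is what shrinks n. A sketch under those assumptions (the numbers below are illustrative, not the Swiss survey parameters):

```python
import math

def freedom_sample_size(confidence: float, design_prevalence: float) -> int:
    """Herds to sample to detect at least one positive with the given
    confidence, assuming a large population and a perfect test:
    n = ln(1 - confidence) / ln(1 - design prevalence)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - design_prevalence))

# Relaxing the required confidence after a favourable risk assessment
# reduces the survey size at the same 2% design prevalence:
print(freedom_sample_size(0.99, 0.02))  # 228 herds at 99% confidence
print(freedom_sample_size(0.95, 0.02))  # 149 herds at 95% confidence
```

Real survey designs additionally account for finite populations, imperfect test sensitivity and within-herd sampling, so published sample sizes (such as those in the abstract) differ from this simplest form.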

  6. Design of the Digital Sky Survey DA and online system: A case history in the use of computer aided tools for data acquisition system design

    NASA Astrophysics Data System (ADS)

    Petravick, D.; Berman, E.; Nicinski, T.; Rechenmacher, R.; Oleynik, G.; Pordes, R.; Stoughton, C.

    1991-06-01

    As part of its expanding Astrophysics program, Fermilab is participating in the Digital Sky Survey (DSS). Fermilab is part of a collaboration involving University of Chicago, Princeton University, and the Institute of Advanced Studies (at Princeton). The DSS main results will be a photometric imaging survey and a redshift survey of galaxies and color-selected quasars over pi steradians of the Northern Galactic Cap. This paper focuses on our use of Computer Aided Software Engineering (CASE) in specifying the data system for DSS. Extensions to standard methodologies were necessary to compensate for tool shortcomings and to improve communication amongst the collaboration members. One such important extension was the incorporation of CASE information into the specification document.

  7. Design of the Digital Sky Survey DA and online system---A case history in the use of computer aided tools for data acquisition system design

    SciTech Connect

    Petravick, D.; Berman, E.; Nicinski, T.; Rechenmacher, R.; Oleynik, G.; Pordes, R.; Stoughton, C.

    1991-06-01

    As part of its expanding Astrophysics program, Fermilab is participating in the Digital Sky Survey (DSS). Fermilab is part of a collaboration involving University of Chicago, Princeton University, and the Institute of Advanced Studies (at Princeton). DSS main results will be a photometric imaging survey and a redshift survey of galaxies and color-selected quasars over π steradians of the Northern Galactic Cap. This paper focuses on our use of Computer Aided Software Engineering (CASE) in specifying the data system for DSS. Extensions to "standard" methodologies were necessary to compensate for tool shortcomings and to improve communication amongst the collaboration members. One such important extension was the incorporation of CASE information into the specification document. 7 refs.

  8. ESO imaging survey: optical deep public survey

    NASA Astrophysics Data System (ADS)

    Mignano, A.; Miralles, J.-M.; da Costa, L.; Olsen, L. F.; Prandoni, I.; Arnouts, S.; Benoist, C.; Madejsky, R.; Slijkhuis, R.; Zaggia, S.

    2007-02-01

    This paper presents new five-passband (UBVRI) optical wide-field imaging data accumulated as part of the DEEP Public Survey (DPS) carried out as a public survey by the ESO Imaging Survey (EIS) project. Out of the 3 square degrees originally proposed, the survey covers 2.75 square degrees in at least one band (normally R), and 1.00 square degree in five passbands. The median seeing, as measured in the final stacked images, is 0.97 arcsec, ranging from 0.75 arcsec to 2.0 arcsec. The median limiting magnitudes (AB system, 2″ aperture, 5σ detection limit) are UAB=25.65, BAB=25.54, VAB=25.18, RAB=24.8 and IAB=24.12 mag, consistent with those proposed in the original survey design. The paper describes the observations and data reduction using the EIS Data Reduction System and its associated EIS/MVM library. The quality of the individual images was inspected; bad images were discarded and the remainder used to produce final image stacks in each passband, from which sources have been extracted. Finally, the scientific quality of these final images and associated catalogs was assessed qualitatively by visual inspection and quantitatively by comparison of statistical measures derived from these data with those of other authors as well as model predictions, and from direct comparison with the results obtained from the reduction of the same dataset using an independent (hands-on) software system. To illustrate one application of this survey, the results of a preliminary effort to identify sub-mJy radio sources are reported. To the limiting magnitude reached in the R and I passbands the success rate ranges from 66 to 81% (depending on the fields). These data are publicly available at CDS. Based on observations carried out at the European Southern Observatory, La Silla, Chile under program Nos. 164.O-0561, 169.A-0725, and 267.A-5729. Appendices A, B and C are only available in electronic form at http://www.aanda.org

  9. The Multiwavelength Survey by Yale-Chile (MUSYC): Survey Design and Deep Public UBVRIz' Images and Catalogs of the Extended Hubble Deep Field-South

    NASA Astrophysics Data System (ADS)

    Gawiser, Eric; van Dokkum, Pieter G.; Herrera, David; Maza, José; Castander, Francisco J.; Infante, Leopoldo; Lira, Paulina; Quadri, Ryan; Toner, Ruth; Treister, Ezequiel; Urry, C. Megan; Altmann, Martin; Assef, Roberto; Christlein, Daniel; Coppi, Paolo S.; Durán, María Fernanda; Franx, Marijn; Galaz, Gaspar; Huerta, Leonor; Liu, Charles; López, Sebastián; Méndez, René; Moore, David C.; Rubio, Mónica; Ruiz, María Teresa; Toft, Sune; Yi, Sukyoung K.

    2006-01-01

    We present UBVRIz' optical images taken with MOSAIC on the CTIO 4 m telescope of the 0.32 deg² Extended Hubble Deep Field-South. This is one of four fields comprising the MUSYC survey, which is optimized for the study of galaxies at z=3, active galactic nucleus (AGN) demographics, and Galactic structure. Our methods used for astrometric calibration, weighted image combination, and photometric calibration in AB magnitudes are described. We calculate corrected aperture photometry and its uncertainties and find through tests that these provide a significant improvement upon standard techniques. Our photometric catalog of 62,968 objects is complete to a total magnitude of RAB=25, with R-band counts consistent with results from the literature. We select z ≈ 3 Lyman break galaxy (LBG) candidates from their UVR colors and find a sky surface density of 1.4 arcmin⁻² and an angular correlation function w(θ) = (2.3 ± 1.0)θ^−0.8, consistent with previous findings that high-redshift Lyman break galaxies reside in massive dark matter halos. Our images and catalogs are available online. Based on observations obtained at Cerro Tololo Inter-American Observatory, a division of the National Optical Astronomy Observatory, which is operated by the Association of Universities for Research in Astronomy, Inc., under cooperative agreement with the National Science Foundation.

  10. Yeasts in floral nectar: a quantitative survey

    PubMed Central

    Herrera, Carlos M.; de Vega, Clara; Canto, Azucena; Pozo, María I.

    2009-01-01

    Background and Aims One peculiarity of floral nectar that remains relatively unexplored from an ecological perspective is its role as a natural habitat for micro-organisms. This study assesses the frequency of occurrence and abundance of yeast cells in floral nectar of insect-pollinated plants from three contrasting plant communities on two continents. Possible correlations between interspecific differences in yeast incidence and pollinator composition are also explored. Methods The study was conducted at three widely separated areas, two in the Iberian Peninsula (Spain) and one in the Yucatán Peninsula (Mexico). Floral nectar samples from 130 species (37–63 species per region) in 44 families were examined microscopically for the presence of yeast cells. For one of the Spanish sites, the relationship across species between incidence of yeasts in nectar and the proportion of flowers visited by each of five major pollinator categories was also investigated. Key Results Yeasts occurred regularly in the floral nectar of many species, where they sometimes reached extraordinary densities (up to 4 × 10⁵ cells mm⁻³). Depending on the region, between 32 and 44% of all nectar samples contained yeasts. Yeast cell densities on the order of 10⁴ cells mm⁻³ were commonplace, and densities >10⁵ cells mm⁻³ were not rare. About one-fifth of species at each site had mean yeast cell densities >10⁴ cells mm⁻³. Across species, yeast frequency and abundance were directly correlated with the proportion of floral visits by bumble-bees, and inversely with the proportion of visits by solitary bees. Conclusions Incorporating nectar yeasts into the scenario of plant–pollinator interactions opens up a number of intriguing avenues for research. In addition, with yeasts being as ubiquitous and abundant in floral nectars as revealed by this study, and given their astounding metabolic versatility, studies focusing on nectar chemical features should carefully control for the presence of yeasts in nectar samples. PMID:19208669

  11. Doctoral Training in Statistics, Measurement, and Methodology in Psychology: Replication and Extension of Aiken, West, Sechrest, and Reno's (1990) Survey of PhD Programs in North America

    ERIC Educational Resources Information Center

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD…

  12. EuropeaN Energy balance Research to prevent excessive weight Gain among Youth (ENERGY) project: Design and methodology of the ENERGY cross-sectional survey

    PubMed Central

    2011-01-01

    Background Obesity treatment is by and large ineffective long term, and more emphasis on the prevention of excessive weight gain in childhood and adolescence is warranted. To inform energy balance related behaviour (EBRB) change interventions, insight into the potential personal, family and school environmental correlates of these behaviours is needed. Studies on such multilevel correlates of EBRB among schoolchildren in Europe are lacking. The ENERGY survey aims to (1) provide up-to-date prevalence rates of measured overweight, obesity, self-reported engagement in EBRBs, and objective accelerometer-based assessment of physical activity and sedentary behaviour and blood-sample biomarkers of metabolic function in countries in different regions of Europe, and (2) identify personal, family and school environmental correlates of these EBRBs. This paper describes the design, methodology and protocol of the survey. Method/Design A school-based cross-sectional survey was carried out in 2010 in seven different European countries: Belgium, Greece, Hungary, the Netherlands, Norway, Slovenia, and Spain. The survey included measurements of anthropometrics, child, parent and school-staff questionnaires, and school observations to measure and assess outcomes (i.e. height, weight, and waist circumference), EBRBs and potential personal, family and school environmental correlates of these behaviours, including social-cultural, physical, political, and economic environmental factors. In addition, a selection of countries conducted accelerometer measurements to objectively assess physical activity and sedentary behaviour, and collected blood samples to assess several biomarkers of metabolic function. Discussion The ENERGY survey is a comprehensive cross-sectional study measuring anthropometrics and biomarkers as well as assessing a range of EBRBs and their potential correlates at the personal, family and school level, among 10-12 year old children in seven European countries. 
This study will result in a unique dataset, enabling cross country comparisons in overweight, obesity, risk behaviours for these conditions as well as the correlates of engagement in these risk behaviours. PMID:21281466

  13. A WHOLE-LAKE WATER QUALITY SURVEY OF LAKE OAHE BASED ON A SPATIALLY-BALANCED PROBABILISTIC DESIGN

    EPA Science Inventory

    Assessing conditions on large bodies of water presents multiple statistical and logistical challenges. As part of the Upper Missouri River Program of the Environmental Monitoring and Assessment Project (EMAP), we surveyed water quality of Lake Oahe in July-August, 2002 using a spat...

  14. STREAM CHEMISTRY IN THE EASTERN UNITED STATES: I. SYNOPTIC SURVEY DESIGN, ACID-BASE STATUS, AND REGIONAL PATTERNS

    EPA Science Inventory

    To assess the regional acid-base status of streams in the Mid-Atlantic and Southeastern United States, spring baseflow chemistry was surveyed in a probability sample of 500 stream reaches representing a population of 64,300 reaches. Approximately half of the streams had acid neut...

  15. Participant Dropout as a Function of Survey Length in Internet-Mediated University Studies: Implications for Study Design and Voluntary Participation in Psychological Research

    PubMed Central

    2010-01-01

    Internet-mediated research has offered substantial advantages over traditional laboratory-based research in terms of efficiently and affordably allowing for the recruitment of large samples of participants for psychology studies. Core technical, ethical, and methodological issues have been addressed in recent years, but the important issue of participant dropout has received surprisingly little attention. Specifically, web-based psychology studies often involve undergraduates completing lengthy and time-consuming batteries of online personality questionnaires, but no known published studies to date have closely examined the natural course of participant dropout during attempted completion of these studies. The present investigation examined participant dropout among 1,963 undergraduates completing one of six web-based survey studies relatively representative of those conducted in university settings. Results indicated that 10% of participants could be expected to drop out of these studies nearly instantaneously, with an additional 2% dropping out per 100 survey items included in the study. For individual project investigators, these findings hold ramifications for study design considerations, such as conducting a priori power analyses. The present results also have broader ethical implications for understanding and improving voluntary participation in research involving human subjects. Nonetheless, the generalizability of these conclusions may be limited to studies involving similar design or survey content. PMID:21142995
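The reported rule of thumb (about 10% immediate dropout plus about 2% per 100 items) lends itself directly to the a priori planning the abstract mentions. A rough sketch, assuming the relationship stays linear over the item range of interest (function names are illustrative):

```python
import math

def expected_completion_rate(n_items: int) -> float:
    """Expected completion fraction using the rule of thumb reported
    above: ~10% drop out nearly instantly, plus ~2% more per 100
    survey items (assumed linear; illustrative only)."""
    return max(0.0, 1.0 - (0.10 + 0.02 * n_items / 100.0))

def recruits_needed(target_completers: int, n_items: int) -> int:
    """Participants to recruit so that, in expectation, enough complete."""
    return math.ceil(target_completers / expected_completion_rate(n_items))

# A 300-item battery: expect ~84% completion, so recruit about 239
# participants to end up with 200 completers in expectation.
print(recruits_needed(200, 300))  # 239
```

As the abstract cautions, these figures may not generalize beyond studies with similar design and content, so the constants above should be treated as study-specific estimates rather than universal parameters.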

  16. Preventing pitfalls in patient surveys.

    PubMed

    Steiber, S R

    1989-05-01

    Properly conceived, customer satisfaction surveys can yield the quantitative data needed to gauge patient satisfaction. But, as the author notes, these surveys can be "a veritable mine field of surprises for the uninitiated." This article, the last in a three-part series on measuring patient satisfaction, describes potential pitfalls and discusses the merits of in-person, mail and telephone surveys. PMID:10293191

  17. Detailed flow surveys of turning vanes designed for a 0.1-scale model of NASA Lewis Research Center's proposed altitude wind tunnel

    NASA Technical Reports Server (NTRS)

    Moore, Royce D.; Shyne, Rickey J.; Boldman, Donald R.; Gelder, Thomas F.

    1987-01-01

    Detailed flow surveys downstream of the corner turning vanes and downstream of the fan inlet guide vanes have been obtained in a 0.1-scale model of the NASA Lewis Research Center's proposed Altitude Wind Tunnel. Two turning vane designs were evaluated in both corners 1 and 2 (the corners between the test section and the drive fan). Vane A was a controlled-diffusion airfoil and vane B was a circular-arc airfoil. At given flows the turning vane wakes were surveyed to determine the vane pressure losses. For both corners the vane A turning vane configuration gave lower losses than the vane B configuration in the regions where the flow regime should be representative of two-dimensional flow. For both vane sets the vane loss coefficient increased rapidly near the walls.

  18. Knowing Where You Are: Using coastal observatories to design and interpret plankton surveys in the New York Bight Apex

    NASA Astrophysics Data System (ADS)

    Quinlan, J. A.; Manderson, J. P.; Shaheen, P.; Law, C. G.

    2004-12-01

    As part of LaTTE, the New York Bight Apex benefited from considerable integrated ocean observing system infrastructure. To apply this IOOS capability to a fisheries problem, a joint Rutgers-NOAA Fisheries pilot project was launched in June 2004 to conduct periodic hydroacoustic/ichthyoplankton surveys throughout the summer and into the autumn. These surveys were aimed at sampling important features (the Hudson River Plume, the Cold Pool, and shelf water) as they changed through time; identifying important water mass-community associations; and moving toward methods of Essential Fish Habitat determination for pelagic habitats. Here we present preliminary findings from the field effort, and outline our use of IOOS capability in fisheries research.

  19. Evaluation of Nine Consensus Indices in Delphi Foresight Research and Their Dependency on Delphi Survey Characteristics: A Simulation Study and Debate on Delphi Design and Interpretation.

    PubMed

    Birko, Stanislav; Dove, Edward S; Özdemir, Vural

    2015-01-01

    The extent of consensus (or the lack thereof) among experts in emerging fields of innovation can serve as antecedents of scientific, societal, investor and stakeholder synergy or conflict. Naturally, how we measure consensus is of great importance to science and technology strategic foresight. The Delphi methodology is a widely used anonymous survey technique to evaluate consensus among a panel of experts. Surprisingly, there is little guidance on how indices of consensus can be influenced by parameters of the Delphi survey itself. We simulated a classic three-round Delphi survey building on the concept of clustered consensus/dissensus. We evaluated three study characteristics that are pertinent for design of Delphi foresight research: (1) the number of survey questions, (2) the sample size, and (3) the extent to which experts conform to group opinion (the Group Conformity Index) in a Delphi study. Their impacts on the following nine Delphi consensus indices were then examined in 1000 simulations: Clustered Mode, Clustered Pairwise Agreement, Conger's Kappa, De Moivre index, Extremities Version of the Clustered Pairwise Agreement, Fleiss' Kappa, Mode, the Interquartile Range and Pairwise Agreement. The dependency of a consensus index on the Delphi survey characteristics was expressed from 0.000 (no dependency) to 1.000 (full dependency). The number of questions (range: 6 to 40) in a survey did not have a notable impact, with dependency values remaining below 0.030. The variation in sample size (range: 6 to 50) displayed the top three impacts for the Interquartile Range, the Clustered Mode and the Mode (dependency = 0.396, 0.130, 0.116, respectively). 
The Group Conformity Index, a construct akin to measuring stubbornness/flexibility of experts' opinions, greatly impacted all nine Delphi consensus indices (dependency = 0.200 to 0.504), except the Extremity CPWA and the Interquartile Range that were impacted only beyond the first decimal point (dependency = 0.087 and 0.083, respectively). Scholars in technology design, foresight research and future(s) studies might consider these new findings in strategic planning of Delphi studies, for example, in rational choice of consensus indices and sample size, or accounting for confounding factors such as experts' variable degrees of conformity (stubbornness/flexibility) in modifying their opinions. PMID:26270647
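Among the nine indices, Pairwise Agreement is the simplest to state: the fraction of expert pairs whose ratings coincide. A minimal sketch on a hypothetical 5-point Likert panel (the panel data and function name are illustrative, not the authors' simulation code):

```python
from itertools import combinations

def pairwise_agreement(ratings):
    """Fraction of expert pairs giving the same rating to a question."""
    pairs = list(combinations(ratings, 2))
    return sum(a == b for a, b in pairs) / len(pairs)

# Hypothetical 5-point Likert responses from a 6-expert panel
panel = [5, 5, 4, 5, 2, 5]
agreement = pairwise_agreement(panel)  # 6 of 15 pairs agree -> 0.4
```

The clustered and extremity variants evaluated in the paper build on this basic quantity.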

  20. Evaluation of Nine Consensus Indices in Delphi Foresight Research and Their Dependency on Delphi Survey Characteristics: A Simulation Study and Debate on Delphi Design and Interpretation

    PubMed Central

    Birko, Stanislav; Dove, Edward S.; Özdemir, Vural

    2015-01-01

    The extent of consensus (or the lack thereof) among experts in emerging fields of innovation can serve as antecedents of scientific, societal, investor and stakeholder synergy or conflict. Naturally, how we measure consensus is of great importance to science and technology strategic foresight. The Delphi methodology is a widely used anonymous survey technique to evaluate consensus among a panel of experts. Surprisingly, there is little guidance on how indices of consensus can be influenced by parameters of the Delphi survey itself. We simulated a classic three-round Delphi survey building on the concept of clustered consensus/dissensus. We evaluated three study characteristics that are pertinent for design of Delphi foresight research: (1) the number of survey questions, (2) the sample size, and (3) the extent to which experts conform to group opinion (the Group Conformity Index) in a Delphi study. Their impacts on the following nine Delphi consensus indices were then examined in 1000 simulations: Clustered Mode, Clustered Pairwise Agreement, Conger’s Kappa, De Moivre index, Extremities Version of the Clustered Pairwise Agreement, Fleiss’ Kappa, Mode, the Interquartile Range and Pairwise Agreement. The dependency of a consensus index on the Delphi survey characteristics was expressed from 0.000 (no dependency) to 1.000 (full dependency). The number of questions (range: 6 to 40) in a survey did not have a notable impact, with dependency values remaining below 0.030. The variation in sample size (range: 6 to 50) displayed the top three impacts for the Interquartile Range, the Clustered Mode and the Mode (dependency = 0.396, 0.130, 0.116, respectively). 
The Group Conformity Index, a construct akin to measuring stubbornness/flexibility of experts’ opinions, greatly impacted all nine Delphi consensus indices (dependency = 0.200 to 0.504), except the Extremity CPWA and the Interquartile Range that were impacted only beyond the first decimal point (dependency = 0.087 and 0.083, respectively). Scholars in technology design, foresight research and future(s) studies might consider these new findings in strategic planning of Delphi studies, for example, in rational choice of consensus indices and sample size, or accounting for confounding factors such as experts’ variable degrees of conformity (stubbornness/flexibility) in modifying their opinions. PMID:26270647

  1. An investigation into the feasibility of designing a framework for the quantitative evaluation of the Clinical Librarian service at an NHS Trust in Brighton, UK.

    PubMed

    Deshmukh, Archana; Roper, Tom

    2014-12-01

    This feature presents research undertaken by Archana Deshmukh for her MA dissertation at the University of Brighton. She worked closely with Tom Roper, the Clinical Librarian at Brighton and Sussex University Hospitals NHS Trust, in a project to explore the feasibility of applying quantitative measures to evaluate the Clinical Librarian service. The investigation used an innovative participatory approach and the findings showed that although an exclusively quantitative approach to evaluation is not feasible, using a mixed methods approach is a way forward. Agreed outputs and outcomes could be embedded in a marketing plan, and the resulting framework could provide evidence to demonstrate overall impact. Archana graduated in July 2014, gaining a Distinction in the MA in Information Studies, and she is currently looking for work in the health information sector. PMID:25443028

  2. Surveying determinants of protein structure designability across different energy models and amino-acid alphabets: A consensus

    NASA Astrophysics Data System (ADS)

    Buchler, Nicolas E. G.; Goldstein, Richard A.

    2000-02-01

    A variety of analytical and computational models have been proposed to answer the question of why some protein structures are more "designable" (i.e., have more sequences folding into them) than others. One class of analytical and statistical-mechanical models has approached the designability problem from a thermodynamic viewpoint. These models highlighted specific structural features important for increased designability. Furthermore, designability was shown to be inherently related to thermodynamically relevant energetic measures of protein folding, such as the foldability F and energy gap Δ10. However, many of these models have been done within a very narrow focus: Namely, pair-contact interactions and two-letter amino-acid alphabets. Recently, two-letter amino-acid alphabets for pair-contact models have been shown to contain designability artifacts which disappear for larger-letter amino-acid alphabets. In addition, a solvation model was demonstrated to give identical designability results to previous two-letter amino-acid alphabet pair-contact models. In light of these discordant results, this report synthesizes a broad consensus regarding the relationship between specific structural features, foldability F, energy gap Δ10, and structure designability for different energy models (pair-contact vs solvation) across a wide range of amino-acid alphabets. We also propose a novel measure Zdk which is shown to be well correlated to designability. Finally, we conclusively demonstrate that two-letter amino-acid alphabets for pair-contact models appear to be solvation models in disguise.
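The notion of designability (the number of sequences whose unique lowest-energy structure is a given target) can be made concrete in a toy pair-contact model. The structures, contact maps, and energy parameters below are hypothetical stand-ins chosen for brevity, not the lattice ensembles or potentials used in the paper:

```python
from itertools import product

# Toy pair-contact model. A structure is abstracted as its set of
# residue-residue contacts; here, the three perfect matchings of a
# 4-residue chain (hypothetical, for illustration only).
STRUCTURES = {
    "S1": {(0, 3), (1, 2)},
    "S2": {(0, 2), (1, 3)},
    "S3": {(0, 1), (2, 3)},
}
# Two-letter (H/P) contact energies: H-H contacts are most favourable.
E = {("H", "H"): -2.3, ("H", "P"): -1.0, ("P", "H"): -1.0, ("P", "P"): 0.0}

def energy(seq, contacts):
    """Pair-contact energy of a sequence threaded onto a structure."""
    return sum(E[(seq[i], seq[j])] for i, j in contacts)

def designability(length=4):
    """Count, for each structure, the sequences whose unique ground
    state (lowest-energy structure) it is."""
    counts = {name: 0 for name in STRUCTURES}
    for seq in product("HP", repeat=length):
        energies = {n: energy(seq, c) for n, c in STRUCTURES.items()}
        ground = min(energies.values())
        winners = [n for n, e in energies.items() if e == ground]
        if len(winners) == 1:  # degenerate ground states are discarded
            counts[winners[0]] += 1
    return counts
```

With this deliberately symmetric toy set each structure ends up equally designable (two sequences each); realistic structure ensembles break that symmetry, which is exactly what the structural determinants surveyed in the paper quantify.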

  3. Cartography at the U.S. Geological Survey: the National Mapping Division's cartographic programs, products, design, and technology

    USGS Publications Warehouse

    Ogrosky, Charles E.; Gwynn, William; Jannace, Richard

    1989-01-01

    The U.S. Geological Survey (USGS) is the prime source of many kinds of topographic and special-purpose maps of the United States and its outlying areas. It is also a prime source of digital map data. One main goal of the USGS is to provide large-scale topographic map coverage of the entire United States. Most of the Nation is already covered. We expect that initial coverage will be completed by 1991. For many purposes, many public agencies, private organizations, and individuals need reliable cartographic and geographic knowledge about our Nation. To serve such needs, all USGS maps are compiled to exacting standards of accuracy and content.

  4. Berkeley Quantitative Genome Browser

    Energy Science and Technology Software Center (ESTSC)

    2008-02-29

    The Berkeley Quantitative Genome Browser provides graphical browsing functionality for genomic data organized, at a minimum, by sequence and position. While supporting the annotation browsing features typical of many other genomic browsers, additional emphasis is placed on viewing and utilizing quantitative data. Data may be read from GFF, SGR, FASTA or any column-delimited format. Once the data has been read into the browser's buffer, it may be searched, filtered or subjected to mathematical transformation. The browser also supplies some graphical design manipulation functionality geared towards preparing figures for presentations or publication. A plug-in mechanism enables development outside the core functionality that adds more advanced or esoteric analysis capabilities. BBrowse's development and distribution is open-source and has been built to run on Linux, OSX and MS Windows operating systems.

  5. Berkeley Quantitative Genome Browser

    SciTech Connect

    Hechmer, Aaron

    2008-02-29

    The Berkeley Quantitative Genome Browser provides graphical browsing functionality for genomic data organized, at a minimum, by sequence and position. While supporting the annotation browsing features typical of many other genomic browsers, additional emphasis is placed on viewing and utilizing quantitative data. Data may be read from GFF, SGR, FASTA or any column-delimited format. Once the data has been read into the browser's buffer, it may be searched, filtered or subjected to mathematical transformation. The browser also supplies some graphical design manipulation functionality geared towards preparing figures for presentations or publication. A plug-in mechanism enables development outside the core functionality that adds more advanced or esoteric analysis capabilities. BBrowse's development and distribution is open-source and has been built to run on Linux, OSX and MS Windows operating systems.
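The SGR format mentioned above is a simple three-column, tab-separated layout (sequence, position, value), which makes the browser's "mathematical transformation" feature easy to illustrate. A minimal sketch of parsing and log-transforming such records (the function names and the log2 choice are illustrative; BBrowse's internal API is not documented here):

```python
import math

def read_sgr(lines):
    """Parse SGR-style records: <sequence> <position> <value>, tab-separated."""
    for line in lines:
        seq, pos, value = line.rstrip("\n").split("\t")
        yield seq, int(pos), float(value)

def log2_transform(records, floor=1e-6):
    """One example transformation on the value column; values are floored
    to avoid taking the log of zero."""
    for seq, pos, value in records:
        yield seq, pos, math.log2(max(value, floor))

data = ["chr1\t100\t8.0", "chr1\t200\t2.0"]
out = list(log2_transform(read_sgr(data)))  # [("chr1", 100, 3.0), ("chr1", 200, 1.0)]
```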

  6. Design and Implementation of a Comprehensive Web-based Survey for Ovarian Cancer Survivorship with an Analysis of Prediagnosis Symptoms via Text Mining.

    PubMed

    Sun, Jiayang; Bogie, Kath M; Teagno, Joe; Sun, Yu-Hsiang Sam; Carter, Rebecca R; Cui, Licong; Zhang, Guo-Qiang

    2014-01-01

    Ovarian cancer (OvCa) is the most lethal gynecologic disease in the United States, with an overall 5-year survival rate of 44.5%, about half of the 89.2% for all breast cancer patients. To identify factors that possibly contribute to the long-term survivorship of women with OvCa, we conducted a comprehensive online Ovarian Cancer Survivorship Survey from 2009 to 2013. This paper presents the design and implementation of our survey, introduces its resulting data source, the OVA-CRADLE™ (Clinical Research Analytics and Data Lifecycle Environment), and illustrates a sample application of the survey and data by an analysis of prediagnosis symptoms, using text mining and statistics. The OVA-CRADLE™ is an application of our patented Physio-MIMI technology, facilitating Web-based access, online query and exploration of data. The prediagnostic symptoms and association of early-stage OvCa diagnosis with endometriosis provide potentially important indicators for future studies in this field. PMID:25861211

  7. Design and Implementation of a Comprehensive Web-based Survey for Ovarian Cancer Survivorship with an Analysis of Prediagnosis Symptoms via Text Mining

    PubMed Central

    Sun, Jiayang; Bogie, Kath M; Teagno, Joe; Sun, Yu-Hsiang (Sam); Carter, Rebecca R; Cui, Licong; Zhang, Guo-Qiang

    2014-01-01

    Ovarian cancer (OvCa) is the most lethal gynecologic disease in the United States, with an overall 5-year survival rate of 44.5%, about half of the 89.2% for all breast cancer patients. To identify factors that possibly contribute to the long-term survivorship of women with OvCa, we conducted a comprehensive online Ovarian Cancer Survivorship Survey from 2009 to 2013. This paper presents the design and implementation of our survey, introduces its resulting data source, the OVA-CRADLE™ (Clinical Research Analytics and Data Lifecycle Environment), and illustrates a sample application of the survey and data by an analysis of prediagnosis symptoms, using text mining and statistics. The OVA-CRADLE™ is an application of our patented Physio-MIMI technology, facilitating Web-based access, online query and exploration of data. The prediagnostic symptoms and association of early-stage OvCa diagnosis with endometriosis provide potentially important indicators for future studies in this field. PMID:25861211

  8. Quantitative measurement of the chemical composition of geological standards with a miniature laser ablation/ionization mass spectrometer designed for in situ application in space research

    NASA Astrophysics Data System (ADS)

    Neuland, M. B.; Grimaudo, V.; Mezger, K.; Moreno-García, P.; Riedo, A.; Tulej, M.; Wurz, P.

    2016-03-01

    A key interest of planetary space missions is the quantitative determination of the chemical composition of the planetary surface material. The chemical composition of surface material (minerals, rocks, soils) yields fundamental information that can be used to answer key scientific questions about the formation and evolution of the planetary body in particular and the Solar System in general. We present a miniature time-of-flight type laser ablation/ionization mass spectrometer (LMS) and demonstrate its capability in measuring the elemental and mineralogical composition of planetary surface samples quantitatively by using a femtosecond laser for ablation/ionization. The small size and weight of the LMS make it a remarkable tool for in situ chemical composition measurements in space research, convenient for operation on a lander or rover exploring a planetary surface. In the laboratory, we measured the chemical composition of four geological standard reference samples USGS AGV-2 Andesite, USGS SCo-1 Cody Shale, NIST 97b Flint Clay and USGS QLO-1 Quartz Latite with LMS. These standard samples are used to determine the sensitivity factors of the instrument. One important result is that all sensitivity factors are close to 1. Additionally, it is observed that the sensitivity factor of an element depends on its electron configuration, hence on the electron work function and the elemental group, in agreement with existing theory. Furthermore, the conformity of the sensitivity factors is supported by mineralogical analyses of the USGS SCo-1 and the NIST 97b samples. With the four different reference samples, the consistency of the calibration factors can be demonstrated, which constitutes the fundamental basis for a standard-less measurement technique for in situ quantitative chemical composition measurements on planetary surfaces.

  9. MALAYSIAN FAMILY LIFE SURVEY

    EPA Science Inventory

    The Malaysian Family Life Surveys (MFLS) comprise a pair of surveys with partially overlapping samples, designed by RAND and administered in Peninsular Malaysia in 1976-77 (MFLS-1) and 1988-89 (MFLS-2). Each survey collected detailed current and retrospective information on famil...

  10. The Introductory Sociology Survey

    ERIC Educational Resources Information Center

    Best, Joel

    1977-01-01

    The Introductory Sociology Survey (ISS) is designed to teach introductory students basic skills in developing causal arguments and in using a computerized statistical package to analyze survey data. Students are given codebooks for survey data and asked to write a brief paper predicting the relationship between at least two variables. (Author)

  11. A Retrospective Survey of Research Design and Statistical Analyses in Selected Chinese Medical Journals in 1998 and 2008

    PubMed Central

    Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia

    2010-01-01

    Background High quality clinical research not only requires advanced professional knowledge, but also sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China during the first decade of the new millennium. Methodology/Principal Findings Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportions in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008: the error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p<0.001), from 59.8% (545/1,335) in 1998 to 52.2% (664/1,578) in 2008. The overall error/defect proportion in study design also decreased (χ² = 21.22, p<0.001), from 50.9% (680/1,335) to 42.4% (669/1,578). In 2008, the proportion of randomized clinical trials remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ² = 93.26, p<0.001), 92.7% (945/1,019) compared to 78.2% (1,023/1,309), and interpretation (χ² = 27.26, p<0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), though some serious defects persisted. 
Conclusions/Significance Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative. PMID:20520824
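The year-over-year comparisons of error/defect proportions in this study are classic two-proportion comparisons on 2×2 tables. A minimal Pearson chi-square sketch with hypothetical counts (not the paper's actual denominators, which are not fully recoverable from the abstract):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]
    (no continuity correction)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: 60/100 papers with defects in year 1 vs 45/100 in year 2
stat = chi2_2x2(60, 40, 45, 55)  # ~4.51; compare against chi-square with 1 df
```

A statistic of about 4.51 exceeds the 1-df critical value of 3.84, so these hypothetical proportions would differ significantly at p < 0.05.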

  12. State-of-the-art and dissemination of computational tools for drug-design purposes: a survey among Italian academics and industrial institutions.

    PubMed

    Artese, Anna; Alcaro, Stefano; Moraca, Federica; Reina, Rocco; Ventura, Marzia; Costantino, Gabriele; Beccari, Andrea R; Ortuso, Francesco

    2013-05-01

    During the first edition of the Computationally Driven Drug Discovery meeting, held in November 2011 at Dompé Pharma (L'Aquila, Italy), a questionnaire regarding the diffusion and the use of computational tools for drug-design purposes in both academia and industry was distributed among all participants. This is a follow-up of a previously reported investigation carried out among a few companies in 2007. The new questionnaire implemented five sections dedicated to: research group identification and classification; 18 different computational techniques; software information; hardware data; and economical business considerations. In this article, together with a detailed history of the different computational methods, a statistical analysis of the survey results that enabled the identification of the prevalent computational techniques adopted in drug-design projects is reported and a profile of the computational medicinal chemist currently working in academia and pharmaceutical companies in Italy is highlighted. PMID:23682568

  13. A Meta-Analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    ERIC Educational Resources Information Center

    Zhang, Lin

    2014-01-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tool. This paper discusses the emerging issues, such as how can learning effectiveness be understood in relation to…

  14. A survey of attitudes and factors associated with successful cardiopulmonary resuscitation (CPR) knowledge transfer in an older population most likely to witness cardiac arrest: design and methodology

    PubMed Central

    Vaillancourt, Christian; Grimshaw, Jeremy; Brehaut, Jamie C; Osmond, Martin; Charette, Manya L; Wells, George A; Stiell, Ian G

    2008-01-01

    Background Overall survival rates for out-of-hospital cardiac arrest rarely exceed 5%. While bystander cardiopulmonary resuscitation (CPR) can increase survival for cardiac arrest victims by up to four times, bystander CPR rates remain low in Canada (15%). Most cardiac arrest victims are men in their sixties, they usually collapse in their own home (85%) and the event is witnessed 50% of the time. These statistics would appear to support a strategy of targeted CPR training for an older population that is most likely to witness a cardiac arrest event. However, interest in CPR training appears to decrease with advancing age. Behaviour surrounding CPR training and performance has never been studied using well validated behavioural theories. Methods/Design The overall goal of this study is to conduct a survey to better understand the behavioural factors influencing CPR training and performance in men and women 55 years of age and older. The study will proceed in three phases. In phase one, semi-structured qualitative interviews will be conducted and recorded to identify common categories and themes regarding seeking CPR training and providing CPR to a cardiac arrest victim. The themes identified in the first phase will be used in phase two to develop, pilot-test, and refine a survey instrument based upon the Theory of Planned Behaviour. In the third phase of the project, the final survey will be administered to a sample of the study population over the telephone. Analyses will include measures of sampling bias, reliability of the measures, construct validity, as well as multiple regression analyses to identify constructs and beliefs most salient to seniors' decisions about whether to attend CPR classes or perform CPR on a cardiac arrest victim. 
Discussion The results of this survey will provide valuable insight into factors influencing the interest in CPR training and performance among a targeted group of individuals most susceptible to witnessing a victim in cardiac arrest. The findings can then be applied to the design of trials of various interventions designed to promote attendance at CPR classes and improve CPR performance. Trial registration ClinicalTrials.gov NCT00665288 PMID:18986547

  15. Obesity-related behaviours and BMI in five urban regions across Europe: sampling design and results from the SPOTLIGHT cross-sectional survey

    PubMed Central

    Lakerveld, Jeroen; Ben Rebah, Maher; Mackenbach, Joreintje D; Charreire, Hélène; Compernolle, Sofie; Glonti, Ketevan; Bardos, Helga; Rutter, Harry; De Bourdeaudhuij, Ilse; Brug, Johannes; Oppert, Jean-Michel

    2015-01-01

    Objectives To describe the design, methods and first results of a survey on obesity-related behaviours and body mass index (BMI) in adults living in neighbourhoods from five urban regions across Europe. Design A cross-sectional observational study in the framework of an European Union-funded project on obesogenic environments (SPOTLIGHT). Setting 60 urban neighbourhoods (12 per country) were randomly selected in large urban zones in Belgium, France, Hungary, the Netherlands and the UK, based on high or low values for median household income (socioeconomic status, SES) and residential area density. Participants A total of 6037 adults (mean age 52 years, 56% female) participated in the online survey. Outcome measures Self-reported physical activity, sedentary behaviours, dietary habits and BMI. Other measures included general health; barriers and motivations for a healthy lifestyle, perceived social and physical environmental characteristics; the availability of transport modes and their use to specific destinations; self-defined neighbourhood boundaries and items related to residential selection. Results Across five countries, residents from low-SES neighbourhoods ate less fruit and vegetables, drank more sugary drinks and had a consistently higher BMI. SES differences in sedentary behaviours were observed in France, with residents from higher SES neighbourhoods reporting to sit more. Residents from low-density neighbourhoods were less physically active than those from high-density neighbourhoods; during leisure time and (most pronounced) for transport (except for Belgium). BMI differences by residential density were inconsistent across all countries. Conclusions The SPOTLIGHT survey provides an original approach for investigating relations between environmental characteristics, obesity-related behaviours and obesity in Europe. First descriptive results indicate considerable differences in health behaviours and BMI between countries and neighbourhood types. 
PMID:26507356
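The SPOTLIGHT sampling design (12 neighbourhoods per country drawn from high/low SES crossed with high/low residential density) amounts to stratified random sampling over a 2×2 grid. A minimal sketch with a hypothetical sampling frame (neighbourhood names and the per-stratum count of 3 are illustrative; the paper does not state the per-stratum allocation):

```python
import random

def select_neighbourhoods(frame, per_stratum=3, seed=42):
    """Randomly draw `per_stratum` neighbourhoods from each of the four
    SES x density strata (3 x 4 strata = 12 per country)."""
    rng = random.Random(seed)  # seeded for reproducibility
    return {stratum: rng.sample(candidates, per_stratum)
            for stratum, candidates in frame.items()}

# Hypothetical frame: candidate neighbourhoods keyed by (SES, density)
frame = {
    ("low", "low"):   [f"n{i}" for i in range(10)],
    ("low", "high"):  [f"n{i}" for i in range(10, 20)],
    ("high", "low"):  [f"n{i}" for i in range(20, 30)],
    ("high", "high"): [f"n{i}" for i in range(30, 40)],
}
picked = select_neighbourhoods(frame)  # 12 neighbourhoods in total
```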

  16. Identifying Influential Facilitators of Mathematics Professional Development: A Survey Analysis of Elementary School Teachers

    ERIC Educational Resources Information Center

    Linder, Sandra M.; Eckhoff, Angela; Igo, Larry B.; Stegelin, Dolores

    2013-01-01

    This paper builds on results from a previous phenomenological study examining characteristics of influential facilitators of elementary mathematics professional development. The current study utilized a survey design where results from the qualitative investigation were quantitized to develop an instrument that allowed participants to identify…

  17. The Math You Need, When You Need It: Student-Centered Web Resources Designed to Decrease Math Review and Increase Quantitative Geology in the Classroom

    NASA Astrophysics Data System (ADS)

    Wenner, J. M.; Baer, E. M.

    2007-12-01

    Introductory geoscience courses are rife with quantitative concepts from graphing to rates to unit conversions. Recent research suggests that supplementary mathematical instruction increases post-secondary students' retention and performance in science courses. Nonetheless, many geoscience faculty feel that they do not have enough time to cover all the geoscience content, let alone covering the math they often feel students should have learned before reaching their classes. We present our NSF-funded effort to create web modules for students that address these concerns. Our web resources focus on both student performance and faculty time issues by building students' quantitative skills through web-based, self-paced modular tutorials. Each module can be assigned to individual students who have demonstrated on a pre-test that they are in need of supplemental instruction. The pre-test involves problems that place mathematical concepts in a geoscience context and determines the students who need the most support with these skills. Students needing support are asked to complete a three-pronged web-based module just before the concept is needed in class. The three parts of each tutorial include: an explanation of the mathematics, a page of practice problems and an on-line quiz that is graded and sent to the instructor. Each of the modules is steeped in best practices in mathematics and geoscience education, drawing on multiple contexts and utilizing technology. The tutorials also provide students with further resources so that they can explore the mathematics in more depth. To assess the rigor of this program, students are given the pre-test again at the end of the course. The uniqueness of this program lies in a rich combination of mathematical concepts placed in multiple geoscience contexts, giving students the opportunity to explore the way that math relates to the physical world. 
We present several preliminary modules dealing with topics common in introductory geoscience courses. We seek feedback from faculty teaching all levels of geoscience addressing several questions: In what math/geoscience topics do you feel students need supplemental instruction? Where do students come up against quantitative topics that make them drop the class or perform poorly? Would you be willing to review or help us to test these modules in your class?

  18. Web Survey Design in ASP.Net 2.0: A Simple Task with One Line of Code

    ERIC Educational Resources Information Center

    Liu, Chang

    2007-01-01

    Over the past few years, more and more companies have been investing in electronic commerce (EC) by designing and implementing Web-based applications. In the world of practice, the importance of using Web technology to reach individual customers has been presented by many researchers. This paper presents an easy way of conducting marketing…

  19. Reflective Filters Design for Self-Filtering Narrowband Ultraviolet Imaging Experiment Wide-Field Surveys (NUVIEWS) Project

    NASA Technical Reports Server (NTRS)

    Park, Jung-Ho; Kim, Jongmin; Zukic, Muamer; Torr, Douglas G.

    1994-01-01

    We report the design of multilayer reflective filters for the self-filtering cameras of the NUVIEWS project. Wide angle self-filtering cameras were designed to image the C IV (154.9 nm) line emission, and H2 Lyman band fluorescence (centered at 161 nm) over a 20 deg x 30 deg field of view. A key element of the filter design includes the development of pi-multilayers optimized to provide maximum reflectance at 154.9 nm and 161 nm for the respective cameras without significant spectral sensitivity to the large cone angle of the incident radiation. We applied self-filtering concepts to design NUVIEWS telescope filters that are composed of three reflective mirrors and one folding mirror. The filters, with narrowband widths of 6 and 8 nm at 154.9 and 161 nm, respectively, have net throughputs of more than 50% with average blocking of out-of-band wavelengths better than 3 x 10(exp -4)%.
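The quoted net throughput above 50% for a train of three pi-multilayer mirrors plus a folding mirror is consistent with per-surface reflectances in the mid-80% range, since the in-band throughput of a mirror train is simply the product of the individual reflectances. A sketch with hypothetical per-mirror values (the actual measured reflectances are not given in the abstract):

```python
def net_throughput(reflectances):
    """Net in-band throughput of a mirror train: the product of the
    per-surface reflectances."""
    t = 1.0
    for r in reflectances:
        t *= r
    return t

# Three pi-multilayer mirrors plus a folding mirror at ~85% each
t = net_throughput([0.85, 0.85, 0.85, 0.85])  # ~0.52, i.e. just over 50%
```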

  1. Evaluating quantitative 3-D image analysis as a design tool for low enriched uranium fuel compacts for the transient reactor test facility: A preliminary study

    DOE PAGES

    Kane, J. J.; van Rooyen, I. J.; Craft, A. E.; Roney, T. J.; Morrell, S. R.

    2016-02-05

    In this study, 3-D image analysis combined with a non-destructive examination technique such as X-ray computed tomography (CT) provides a highly quantitative tool for investigating a material's structure. In this investigation 3-D image analysis and X-ray CT were combined to analyze the microstructure of a preliminary subsized fuel compact for the Transient Reactor Test Facility's low enriched uranium conversion program, to assess the feasibility of the combined techniques for use in optimizing the fuel compact fabrication process. The quantitative image analysis focused on determining the size and spatial distribution of the surrogate fuel particles and the size, shape, and orientation of voids within the compact. Additionally, the maximum effect of microstructural features on heat transfer through the carbonaceous matrix of the preliminary compact was estimated. The surrogate fuel particles occupied 0.8% of the compact by volume, with a log-normal distribution of particle sizes with a mean diameter of 39 μm and a standard deviation of 16 μm. Roughly 39% of the particles had a diameter greater than the specified maximum particle size of 44 μm, suggesting that the particles agglomerate during fabrication. The local volume fraction of particles also varies significantly within the compact, although the non-uniformities appear to be evenly dispersed throughout the analysed volume. The voids produced during fabrication were on average plate-like, with their major axis oriented perpendicular to the compaction direction. Finally, the microstructure, mainly the large preferentially oriented voids, may cause a small degree of anisotropy in the thermal diffusivity within the compact: α∥/α⊥, the ratio of thermal diffusivities parallel and perpendicular to the compaction direction, is expected to be no less than 0.95, with an upper bound of 1.
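    The particle statistics quoted above can be sanity-checked in a few lines. A rough sketch, assuming the 39 μm mean and 16 μm standard deviation parameterize a log-normal diameter distribution directly (the reported 39% oversize figure comes from the measured particles, so an exact match is not expected):

    ```python
    import math

    # Particle statistics from the abstract (um); the log-normal form is an
    # assumption made for this sketch, not a claim about the authors' fit.
    mean_d, sd_d = 39.0, 16.0

    # Convert arithmetic mean/SD to log-normal parameters mu and sigma.
    sigma2 = math.log(1.0 + (sd_d / mean_d) ** 2)
    sigma = math.sqrt(sigma2)
    mu = math.log(mean_d) - sigma2 / 2.0

    # Fraction exceeding the 44 um limit: P(D > 44) = 1 - Phi((ln 44 - mu)/sigma).
    z = (math.log(44.0) - mu) / sigma
    frac_oversize = 0.5 * math.erfc(z / math.sqrt(2.0))
    print(f"estimated oversize fraction: {frac_oversize:.1%}")
    ```

    The fitted form gives roughly 31% above 44 μm, below the reported 39%, which suggests the measured distribution is heavier-tailed than a simple log-normal.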

  2. Computational vaccinology: quantitative approaches.

    PubMed

    Flower, Darren R; McSparron, Helen; Blythe, Martin J; Zygouri, Christianna; Taylor, Debra; Guan, Pingping; Wan, Shouzhan; Coveney, Peter V; Walshe, Valerie; Borrow, Persephone; Doytchinova, Irini A

    2003-01-01

    The immune system is hierarchical and has many levels, exhibiting much emergent behaviour. However, at its heart are molecular recognition events that are indistinguishable from other types of biomacromolecular interaction. These can be addressed well by quantitative experimental and theoretical biophysical techniques, and particularly by methods from drug design. We review here our approach to computational immunovaccinology. In particular, we describe the JenPep database and two new techniques for T cell epitope prediction. One is based on quantitative structure-activity relationships (a 3D-QSAR method based on CoMSIA and another 2D method based on the Free-Wilson approach) and the other on atomistic molecular dynamics simulations using high-performance computing. JenPep (http://www.jenner.ac.uk/JenPep) is a relational database system supporting quantitative data on peptide binding to major histocompatibility complexes, TAP transporters, TCR-pMHC complexes, and an annotated list of B cell and T cell epitopes. Our 2D-QSAR method factors the contribution to peptide binding from individual amino acids as well as 1-2 and 1-3 residue interactions. In the 3D-QSAR approach, the influence of five physicochemical properties (volume, electrostatic potential, hydrophobicity, hydrogen-bond donor and acceptor abilities) on peptide affinity was considered. Both methods are exemplified through their application to the well-studied problem of peptide binding to the human class I MHC molecule HLA-A*0201. PMID:14712934
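    The 2D-QSAR method described above treats binding affinity as a sum of per-residue contributions. A toy Free-Wilson regression on synthetic 9-mer peptides illustrates the idea (purely additive effects; the 1-2 and 1-3 interaction terms the authors also model are omitted, and all data here are simulated):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    AAS = "ACDEFGHIKLMNPQRSTVWY"   # the 20 standard amino acids
    LEN = 9                        # class I MHC binders are typically 9-mers

    # Synthetic ground truth: one additive contribution per (position, residue).
    true_w = rng.normal(0.0, 1.0, size=(LEN, len(AAS)))

    def one_hot(peptide):
        """Flatten a peptide into a position x residue indicator vector."""
        x = np.zeros((LEN, len(AAS)))
        for pos, aa in enumerate(peptide):
            x[pos, AAS.index(aa)] = 1.0
        return x.ravel()

    # Simulate a training set of random peptides with additive affinities.
    peptides = ["".join(rng.choice(list(AAS), LEN)) for _ in range(400)]
    X = np.array([one_hot(p) for p in peptides])
    y = X @ true_w.ravel() + rng.normal(0.0, 0.05, size=len(peptides))

    # Free-Wilson fit: least squares over the indicator matrix.
    w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    pred = X @ w_hat
    r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"training R^2 = {r2:.3f}")
    ```

    The indicator matrix is rank-deficient (each position's columns sum to one), so `lstsq` returns the minimum-norm solution; predictions are unaffected.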

  3. The U. S. Geological Survey's Albemarle-Pamlico National Water-Quality Assessment Study; background and design

    USGS Publications Warehouse

    Spruill, T.B.; Harned, Douglas A.; McMahon, Gerard

    1995-01-01

    The Albemarle-Pamlico Study Unit is one of 20 National Water-Quality Assessment (NAWQA) studies begun in 1991 by the U.S. Geological Survey (USGS) to assess the Nation's water quality. One of the missions of the USGS is to assess the quantity and quality of the Nation's water resources. The NAWQA program was established to help accomplish this mission. The Albemarle-Pamlico Study Unit, located in Virginia and North Carolina, drains an area of about 28,000 square miles. Four major rivers, the Chowan, the Roanoke, the Tar-Pamlico and the Neuse, all drain into the Albemarle-Pamlico Sound in North Carolina. Four physiographic regions (areas of homogeneous climatic, geologic, and biological characteristics), the Valley and Ridge, Blue Ridge, Piedmont and Coastal Plain Physiographic Provinces are included within the Albemarle-Pamlico Study Unit. Until 1991, there was no single program that could answer the question, 'Are the Nation's ground and surface waters getting better, worse, or are they staying the same?' A program was needed to evaluate water quality by using standard techniques to allow assessment of water quality at local, regional, and national scales. The NAWQA Program was implemented to answer questions about the Nation's water quality using consistent and comparable methods. A total of 60 basins, or study units, will be in place by 1997 to assess the Nation's water quality.

  4. Application of best-fit survey techniques throughout design, manufacturing and installation of the MKII divertor at JET

    SciTech Connect

    Macklin, B.; Celentano, G.; Tait, J.; Lente, E. van; Brade, R.; Shaw, R.

    1995-12-31

    The precise installation and alignment of large components in an activated and beryllium-contaminated fusion device is a problem which must be faced in JET as well as future devices such as ITER. To guarantee the successful alignment of the MKII Divertor in JET it was essential that, early in the design phase, realistic manufacturing and installation tolerances and restrictions were identified and considered. The main components of the MKII divertor structure are an inner and outer ring mounted on a base plate. The assembly, 6 m in overall diameter, is dismantled into 24 sub-assemblies for installation. The structure must be installed very accurately by workers wearing full pressurized suits. As the other major in-vessel components remain unchanged, it is important that the new divertor be installed to the same center as these components. Major considerations in the design process were the installation accuracy required, the installation method, and the restrictions imposed by the existing in-vessel structure. Joints between modules could only be made from one side due to access restrictions. Design of the support system had to be such that minimal modification to the existing in-vessel structure would be required. The tight tolerances necessary to ensure the mechanical integrity of the module joints had to be balanced against the need for realistic assembly tolerances.
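    The "best-fit survey techniques" in the title are typically rigid-body fits of measured target coordinates to design coordinates. A minimal Kabsch/Procrustes sketch on invented coordinates (nothing below comes from the JET paper; the rotation, offsets, and noise level are illustrative assumptions):

    ```python
    import numpy as np

    # Nominal (design) coordinates of four reference targets and a simulated
    # measured set (metres); all values are invented for illustration.
    design = np.array([[3.0, 0.0, 0.0], [0.0, 3.0, 0.0],
                       [-3.0, 0.0, 0.0], [0.0, -3.0, 0.5]])
    rng = np.random.default_rng(3)
    a = np.deg2rad(1.0)                               # small unknown rotation
    R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                       [np.sin(a),  np.cos(a), 0.0],
                       [0.0, 0.0, 1.0]])
    measured = design @ R_true.T + np.array([0.010, -0.005, 0.002])
    measured += rng.normal(0.0, 1e-4, size=design.shape)  # ~0.1 mm survey noise

    # Kabsch/Procrustes: rotation + translation minimizing RMS point misfit.
    P = design - design.mean(axis=0)
    Q = measured - measured.mean(axis=0)
    U, S, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(U @ Vt))                # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    fitted = P @ R.T + measured.mean(axis=0)
    rms = np.sqrt(np.mean(np.sum((fitted - measured) ** 2, axis=1)))
    print(f"best-fit RMS residual: {rms * 1000:.3f} mm")
    ```

    A residual at the noise level confirms the fit recovered the unknown rotation and translation; in practice the residual pattern itself is the alignment diagnostic.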

  5. Back to the basics: Maximizing the information obtained by quantitative two dimensional gel electrophoresis analyses by an appropriate experimental design and statistical analyses.

    PubMed

    Valledor, Luis; Jorrín, Jesús

    2011-01-01

    Two-dimensional gel electrophoresis (2-DE) has been one of the most widely used techniques for protein separation in proteomics experiments, and it continues to be so for some species such as plants. Despite constant technical advances and continuous improvements in the field of 2-DE, the experimental design and the analysis of protein abundance data continue to be ignored or poorly documented in the literature. An appropriate experimental design, followed by sound statistical methods, is mandatory to extract all the information concealed in the complexity of 2-DE data. In this work we review, in a biologist's language, the experimental design and statistical tests to be considered when planning a 2-DE based proteomics experiment and for the correct analysis and interpretation of the data. We aim to provide the researcher with an up-to-date introduction to these areas, starting with the experimental design and ending with the application of multivariate statistical methodologies such as PCA, ICA or neural network-based self-organizing maps. In between, we describe in an understandable way the current methodologies available to deal with all the stages of experimental design, data processing and analysis. PMID:20656082
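    For the multivariate stage mentioned above, PCA of the gel × spot abundance matrix is the usual entry point. A minimal SVD-based sketch on simulated 2-DE data (dimensions, replicate counts, and effect sizes are invented, not from the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_gels, n_spots = 12, 200   # e.g. 2 conditions x 6 replicate gels

    # Simulate spot abundances: 20 spots shift between the two conditions.
    abund = rng.normal(10.0, 1.0, size=(n_gels, n_spots))
    abund[6:, :20] += 3.0       # treatment effect on the first 20 spots

    # PCA via SVD of the column-centered matrix.
    centered = abund - abund.mean(axis=0)
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    scores = U * S                         # gel coordinates on the PCs
    explained = S ** 2 / np.sum(S ** 2)    # variance explained per PC

    print(f"PC1 explains {explained[0]:.1%} of the variance")
    # Gels 0-5 and 6-11 should separate along PC1.
    ```

    With a real data set the same score plot reveals outlier gels and batch effects before any spot-level testing is attempted.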

  6. Design

    ERIC Educational Resources Information Center

    Buchanan, Richard; Cross, Nigel; Durling, David; Nelson, Harold; Owen, Charles; Valtonen, Anna; Boling, Elizabeth; Gibbons, Andrew; Visscher-Voerman, Irene

    2013-01-01

    Scholars representing the field of design were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Richard Buchanan, Nigel Cross, David Durling, Harold Nelson, Charles Owen, and Anna Valtonen. Scholars…

  8. City Governments and Aging in Place: Community Design, Transportation and Housing Innovation Adoption

    ERIC Educational Resources Information Center

    Lehning, Amanda J.

    2012-01-01

    Purpose of the study: To examine the characteristics associated with city government adoption of community design, housing, and transportation innovations that could benefit older adults. Design and methods: A mixed-methods study with quantitative data collected via online surveys from 62 city planners combined with qualitative data collected via…

  11. The path of placement of a removable partial denture: a microscope based approach to survey and design.

    PubMed

    Mamoun, John Sami

    2015-02-01

    This article reviews how to identify and develop a removable partial denture (RPD) path of placement, and provides a literature review of the concept of the RPD path of placement, also known as the path of insertion. An optimal RPD path of placement, guided by mutually parallel guide planes, ensures that the RPD flanges fit intimately over edentulous ridge structures and that the framework fits intimately with guide plane surfaces, which prevents food-collecting empty spaces between the intaglio surface of the framework and intraoral surfaces, and ensures that RPD clasps engage adequate numbers of tooth undercuts for retention. The article covers the causes of obstructions to RPD intra-oral seating, the causes of food-collecting empty spaces that may exist around an RPD, and how to identify whether a guide plane is parallel with the projected RPD path of placement. It presents a method of using a surgical operating microscope, or high-magnification (6-8x or greater) binocular surgical loupe telescopes, combined with co-axial illumination, to identify a preliminary path of placement for an arch. This preliminary path of placement concept may help guide a dentist or a dental laboratory technician when surveying a master cast of the arch to develop an RPD path of placement, or in verifying that intra-oral contouring has aligned tooth surfaces optimally with the RPD path of placement. In dentistry, a well-fitting RPD reduces long-term periodontal or structural damage to abutment teeth. PMID:25722842

  12. Use of Web and In-Person Survey Modes to Gather Data from Young Adults on Sex and Drug Use: An Evaluation of Cost, Time, and Survey Error Based on a Randomized Mixed-Mode Design

    ERIC Educational Resources Information Center

    McMorris, Barbara J.; Petrie, Renee S.; Catalano, Richard F.; Fleming, Charles B.; Haggerty, Kevin P.; Abbott, Robert D.

    2009-01-01

    In a randomized test of mixed-mode data collection strategies, 386 participants in the Raising Healthy Children (RHC) Project were either (a) asked to complete a survey via the Internet and later offered the opportunity to complete the survey in person or (b) first offered an in-person survey, with a Web follow-up. The Web-first condition…

  13. Study Quality in SLA: A Cumulative and Developmental Assessment of Designs, Analyses, Reporting Practices, and Outcomes in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2011-01-01

    I began this study with two assumptions. Assumption 1: Study quality matters. If the means by which researchers design, carry out, and report on their studies lack in rigor or transparency, theory and practice are likely to be misguided or at least decelerated. Assumption 2 is an implication of Assumption 1: Quality should be measured rather than…

  14. Three-dimensional quantitative structure-activity relationships and docking studies of some structurally diverse flavonoids and design of new aldose reductase inhibitors.

    PubMed

    Chandra De, Utpal; Debnath, Tanusree; Sen, Debanjan; Debnath, Sudhan

    2015-01-01

    Aldose reductase (AR) plays an important role in the development of several long-term diabetic complications. Inhibition of AR activities is a strategy for controlling complications arising from chronic diabetes. Several AR inhibitors have been reported in the literature. Flavonoid-type compounds are shown to have significant AR inhibition. The objective of this study was to perform computational work to gain structural insight into flavonoid-type compounds, both for developing and for searching new flavonoid-based AR inhibitors. The data set comprising 68 flavones, with pIC50 values ranging from 0.44 to 4.59, was collected from the literature. Structures of all the flavonoids were drawn in ChemBioDraw Ultra 11.0, converted into the corresponding three-dimensional structures, saved as mol files and then imported into the Maestro project table. Imported ligands were prepared using the LigPrep option of Maestro 9.6. Three-dimensional quantitative structure-activity relationship and docking studies were performed with appropriate options of Maestro 9.6 installed on an HP Z820 workstation with CentOS 6.3 (Linux). A model with five partial least squares factors, standard deviation 0.2482, R² = 0.9502 and a regression variance ratio of 122 was found to be the best statistical model. PMID:25709964

  15. Measuring health literacy in populations: illuminating the design and development process of the European Health Literacy Survey Questionnaire (HLS-EU-Q)

    PubMed Central

    2013-01-01

    Background: Several measurement tools have been developed to measure health literacy. The tools vary in their approach and design, but few have focused on comprehensive health literacy in populations. This paper describes the design and development of the European Health Literacy Survey Questionnaire (HLS-EU-Q), an innovative, comprehensive tool to measure health literacy in populations. Methods: Based on a conceptual model and definition, the process involved item development, pre-testing, field-testing, external consultation, a plain-language check, and translation from English to Bulgarian, Dutch, German, Greek, Polish, and Spanish. Results: The development process resulted in the HLS-EU-Q, which entailed two sections: a core health literacy section and a section on determinants and outcomes associated with health literacy. The health literacy section included 47 items addressing self-reported difficulties in accessing, understanding, appraising and applying information in tasks concerning decision making in healthcare, disease prevention, and health promotion. The second section included items related to health behaviour, health status, health service use, community participation, and socio-demographic and socio-economic factors. Conclusions: By illuminating the detailed steps in the design and development process of the HLS-EU-Q, the aim is to provide a deeper understanding of its purpose, its capabilities and its limitations for others using the tool. The vision is that, through wide application, the HLS-EU-Q will be validated in more countries to enhance the understanding of health literacy in different populations. PMID:24112855

  16. [Dietary and lifestyle-induced diseases in children. Design, examination modules and study population of the baseline survey of the German IDEFICS cohort].

    PubMed

    Hebestreit, A; Ahrens, W

    2012-06-01

    The European IDEFICS (Identification and Prevention of Dietary- and Lifestyle-induced Health Effects in Children and Infants) Study investigates risk factors of diet- and lifestyle-related diseases in children focusing on overweight, obesity and related metabolic co-morbidities based on a standardized study protocol. In parallel, the IDEFICS study developed, implemented and evaluated strategies for the primary prevention of diet- and lifestyle-related diseases in a controlled, community-oriented design. The prospective cohort study started with a baseline survey from September 2007 to May 2008 in eight European countries, with Germany among them. During the first survey 2,065 German children aged 2-9 years passed a comprehensive examination program. Their parents answered questions on sociodemographic characteristics; media consumption; dietary, activity and sleep patterns; as well as family life and the residential environment. The results of the study will contribute to the development of harmonized European guidelines on diet and lifestyle for health promotion and disease prevention in children. PMID:22736172

  17. Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset

    USGS Publications Warehouse

    Raef, A.

    2009-01-01

    The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emerging need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on a cost-effective and robust quality control and assurance (QC/QA) workflow for 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information, data that are free of noise-dominated traces, and/or flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used to demonstrate the QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.
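    Normal moveout, one of the QC diagnostics listed above, follows the hyperbolic traveltime relation t(x) = sqrt(t0² + (x/v)²) for a flat reflector. A sketch of flagging picks that deviate from this ideal (velocity, zero-offset time, tolerance, and the pick values are all assumed for illustration, not taken from the article):

    ```python
    import math

    def nmo_time(t0, offset, velocity):
        """Hyperbolic two-way traveltime t(x) = sqrt(t0^2 + (x/v)^2)."""
        return math.sqrt(t0 ** 2 + (offset / velocity) ** 2)

    # Illustrative gather: (offset in m, picked reflection time in s).
    v, t0, tol = 1800.0, 0.40, 0.015
    picks = [(100.0, 0.404), (300.0, 0.434), (600.0, 0.560), (900.0, 0.640)]

    flags = []
    for offset, t_pick in picks:
        resid = t_pick - nmo_time(t0, offset, v)
        flags.append("FLAG" if abs(resid) > tol else "ok")
        print(f"offset {offset:5.0f} m  residual {resid:+.3f} s  {flags[-1]}")
    # The 600 m pick was made deliberately inconsistent and should be flagged.
    ```

    In a production workflow the residuals would be plotted over the full survey geometry rather than thresholded trace by trace.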

  18. Establishment of a 100-seed weight quantitative trait locus-allele matrix of the germplasm population for optimal recombination design in soybean breeding programmes.

    PubMed

    Zhang, Yinghu; He, Jianbo; Wang, Yufeng; Xing, Guangnan; Zhao, Jinming; Li, Yan; Yang, Shouping; Palmer, R G; Zhao, Tuanjie; Gai, Junyi

    2015-10-01

    A representative sample comprising 366 accessions from the Chinese soybean landrace population (CSLRP) was tested under four growth environments for determination of the whole-genome quantitative trait loci (QTLs) system of the 100-seed weight trait (ranging from 4.59 g to 40.35 g) through genome-wide association study (GWAS). A total of 116,769 single nucleotide polymorphisms (SNPs) were identified and organized into 29,121 SNP linkage disequilibrium blocks (SNPLDBs) to fit the property of multiple alleles/haplotypes per locus in germplasm. An innovative two-stage GWAS was conducted using a single locus model for shrinking the marker number followed by a multiple loci model utilizing a stepwise regression for the whole-genome QTL identification. In total, 98.45% of the phenotypic variance (PV) was accounted for by four large-contribution major QTLs (36.33%), 51 small-contribution major QTLs (43.24%), and a number of unmapped minor QTLs (18.88%), with the QTL×environment variance representing only 1.01% of the PV. The allele numbers of each QTL ranged from two to 10. A total of 263 alleles along with the respective allele effects were estimated and organized into a 263×366 matrix, giving the compact genetic constitution of the CSLRP. Differentiations among the ecoregion matrices were found. No landrace had alleles which were all positive or all negative, indicating a hidden potential for recombination. The optimal crosses within and among ecoregions were predicted, and showed great transgressive potential. From the QTL system, 39 candidate genes were annotated, of which 26 were involved with the gene ontology categories of biological process, cellular component, and molecular function, indicating that diverse genes are involved in directing the 100-seed weight. PMID:26163701
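    The two-stage GWAS described above ends with a multiple-loci stepwise regression. A toy forward-selection sketch on simulated genotypes (sample sizes match the 366 accessions, but the marker count, effect sizes, and stopping cutoff are illustrative assumptions, not the authors' settings):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_lines, n_markers = 366, 500   # accessions x candidate SNPLDB loci

    # Simulated 0/1/2 genotype codes; three true QTLs drive the phenotype.
    G = rng.integers(0, 3, size=(n_lines, n_markers)).astype(float)
    true_qtls = [10, 200, 450]
    y = G[:, true_qtls] @ np.array([1.5, -1.0, 0.8]) + rng.normal(0, 1, n_lines)

    def rss(X, y):
        """Residual sum of squares of an ordinary least-squares fit."""
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        return float(r @ r)

    # Forward selection: add the marker giving the largest RSS drop, stopping
    # when the partial F statistic falls below a crude multiplicity-adjusted cutoff.
    selected, X = [], np.ones((n_lines, 1))          # start with intercept only
    best = rss(X, y)
    while len(selected) < 10:                        # cap on model size
        gains = {j: best - rss(np.column_stack([X, G[:, j]]), y)
                 for j in range(n_markers) if j not in selected}
        j_best = max(gains, key=gains.get)
        df = n_lines - X.shape[1] - 1
        f_stat = gains[j_best] / ((best - gains[j_best]) / df)
        if f_stat < 20.0:                            # assumed cutoff, not the authors'
            break
        selected.append(j_best)
        X = np.column_stack([X, G[:, j_best]])
        best -= gains[j_best]

    print("selected markers:", sorted(selected))     # should include 10, 200, 450
    ```

    The per-allele effect estimates from the final model are what populate a QTL-allele matrix of the kind the authors build.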

  19. Hydrophilic interaction liquid chromatography-tandem mass spectrometry quantitative method for the cellular analysis of varying structures of gemini surfactants designed as nanomaterial drug carriers.

    PubMed

    Donkuru, McDonald; Michel, Deborah; Awad, Hanan; Katselis, George; El-Aneed, Anas

    2016-05-13

    Diquaternary gemini surfactants have successfully been used to form lipid-based nanoparticles that are able to compact, protect, and deliver genetic materials into cells. However, what happens to the gemini surfactants after they have released their therapeutic cargo is unknown. Such knowledge is critical to assess the quality, safety, and efficacy of gemini surfactant nanoparticles. We have developed a simple and rapid liquid chromatography electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS) method for the quantitative determination of various structures of gemini surfactants in cells. Hydrophilic interaction liquid chromatography (HILIC) was employed, allowing for a short, simple isocratic run of only 4 min. The lower limit of detection (LLOD) was 3 ng/mL. The method was applicable to 18 structures of gemini surfactants belonging to two different structural families. A full method validation was performed for two lead compounds according to USFDA guidelines. The HILIC-MS/MS method was compatible with the physicochemical properties of gemini surfactants, which bear a permanent positive charge with both hydrophilic and hydrophobic elements within their molecular structure. In addition, an effective liquid-liquid extraction method (98% recovery) was employed, surpassing previously used extraction methods. The analysis of nanoparticle-treated cells showed an initial rise in the analyte intracellular concentration, followed by a maximum and a somewhat more gradual decrease. The observed intracellular depletion of the gemini surfactants may be attributable to their biotransformation into metabolites and exocytosis from the host cells. The cellular data showed a pattern that warrants further investigation to evaluate metabolite formation and assess the subcellular distribution of the tested compounds. PMID:27086283
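    Detection limits such as the 3 ng/mL LLOD quoted above are commonly estimated from a calibration curve as 3.3·σ/S, where σ is the residual standard deviation and S the slope (the ICH Q2 convention); the abstract does not state how its LLOD was derived, so the calibration data below are invented purely to illustrate the calculation:

    ```python
    import numpy as np

    # Hypothetical calibration points: concentration (ng/mL) vs. response ratio.
    conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0, 500.0])
    resp = np.array([0.021, 0.104, 0.199, 1.02, 1.98, 10.1])

    # Ordinary least-squares calibration line: resp = slope * conc + intercept.
    slope, intercept = np.polyfit(conc, resp, 1)
    resid = resp - (slope * conc + intercept)
    sigma = resid.std(ddof=2)          # residual SD (two fitted parameters)

    llod = 3.3 * sigma / slope         # ICH Q2-style detection limit
    lloq = 10.0 * sigma / slope        # corresponding quantitation limit
    print(f"LLOD ~ {llod:.1f} ng/mL, LLOQ ~ {lloq:.1f} ng/mL")
    ```

    Regulatory validations often instead use weighted regression or signal-to-noise criteria; this unweighted fit is only the simplest variant.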

  20. DRAFT - Design of Radiological Survey and Sampling to Support Title Transfer or Lease of Property on the Department of Energy Oak Ridge Reservation

    SciTech Connect

    Cusick L.T.

    2002-09-25

    The U.S. Department of Energy (DOE) owns, operates, and manages the buildings and land areas on the Oak Ridge Reservation (ORR) in Oak Ridge, Tennessee. As land and buildings are declared excess or underutilized, it is the intent of DOE to either transfer the title of or lease suitable property to the Community Reuse Organization of East Tennessee (CROET) or other entities for public use. It is DOE's responsibility, in coordination with the U.S. Environmental Protection Agency (EPA), Region 4, and the Tennessee Department of Environment and Conservation (TDEC), to ensure that the land, facilities, and personal property that are to have the title transferred or are to be leased are suitable for public use. Release of personal property must also meet site requirements and be approved by the DOE contractor responsible for site radiological control. The terms title transfer and lease in this document have unique meanings. Title transfer will result in release of ownership without any restriction or further control by DOE. Under lease conditions, the government retains ownership of the property along with the responsibility to oversee property utilization. This includes involvement in the lessee's health, safety, and radiological control plans and conduct of site inspections. It may also entail lease restrictions, such as limiting access to certain areas or prohibiting digging, drilling, or disturbing material under surface coatings. Survey and sampling requirements are generally more rigorous for title transfer than for lease. Because of the accelerated clean up process, there is an increasing emphasis on title transfers of facilities and land. The purpose of this document is to describe the radiological survey and sampling protocols that are being used for assessing the radiological conditions and characteristics of building and land areas on the Oak Ridge Reservation that contain space potentially available for title transfer or lease. 
    After the necessary surveys, sampling, and laboratory analyses are completed, the data are analyzed and included in an Environmental Baseline Summary (EBS) report for title transfer or in a Baseline Environmental Analysis Report (BEAR) for lease. The data from the BEAR are then used in a Screening-Level Human Health Risk Assessment (SHHRA) or a risk calculation (RC) to assess the potential risks to future owners/occupants. If title is to be transferred, release criteria in the form of specific activity concentrations called Derived Concentration Guideline Levels (DCGLs) will be developed for each property. The DCGLs are based on the risk model and are used with the data in the EBS to determine, with statistical confidence, that the release criteria for the property have been met. The goal of the survey and sampling efforts is to (1) document the baseline conditions of the property (real or personal) prior to title transfer or lease, (2) obtain enough information that an evaluation of radiological risks can be made, and (3) collect sufficient data so that areas that contain minimal residual levels of radioactivity can be identified and, following radiological control procedures, be released from radiological control. (It should be noted that release from radiological control does not necessarily mean free release, because DOE may maintain institutional control of the site after it is released from radiological control.) To meet the goals of this document, a Data Quality Objective (DQO) process will be used to enhance data collection efficiency and assist with decision making. The steps of the DQO process involve stating the problem, identifying the decision, identifying inputs to the decision, developing study boundaries, developing the decision rule, and optimizing the design. This document describes the DQOs chosen for the surveys and sampling efforts performed for the purposes listed above.
    The previous version of this document focused on the requirements for radiological survey and sampling protocols to be used for leasing. Because the primary focus at this time is on title transfer, this revision applies to both situations.
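    The decision rule produced by the DQO process typically reduces to a statistical comparison of survey measurements against the DCGL; MARSSIM-style final-status surveys use a Sign or Wilcoxon test for this. A minimal Sign-test sketch with invented measurements and an assumed DCGL (nothing here is from the Oak Ridge protocols):

    ```python
    from math import comb

    # Hypothetical survey-unit measurements (pCi/g) and an assumed DCGL.
    dcgl = 5.0
    measurements = [3.1, 2.8, 4.0, 3.5, 2.2, 4.4, 3.0, 3.7, 2.9, 3.3]

    # Sign test: under H0 (median residual activity equals the DCGL), each
    # measurement falls below the DCGL with probability 1/2.
    n = len(measurements)
    k = sum(1 for m in measurements if m < dcgl)
    p_value = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

    print(f"{k}/{n} measurements below DCGL, one-sided p = {p_value:.4f}")
    # All 10 below the DCGL gives p = 1/1024, so the unit passes at alpha = 0.05.
    ```

    Real surveys set the number of samples in the DQO stage so that this test achieves specified Type I and Type II error rates.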